Nov 28 10:15:02 np0005538960 kernel: Linux version 5.14.0-642.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-68.el9) #1 SMP PREEMPT_DYNAMIC Thu Nov 20 14:15:03 UTC 2025
Nov 28 10:15:02 np0005538960 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Nov 28 10:15:02 np0005538960 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 28 10:15:02 np0005538960 kernel: BIOS-provided physical RAM map:
Nov 28 10:15:02 np0005538960 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Nov 28 10:15:02 np0005538960 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Nov 28 10:15:02 np0005538960 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Nov 28 10:15:02 np0005538960 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Nov 28 10:15:02 np0005538960 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Nov 28 10:15:02 np0005538960 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Nov 28 10:15:02 np0005538960 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Nov 28 10:15:02 np0005538960 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Nov 28 10:15:02 np0005538960 kernel: NX (Execute Disable) protection: active
Nov 28 10:15:02 np0005538960 kernel: APIC: Static calls initialized
Nov 28 10:15:02 np0005538960 kernel: SMBIOS 2.8 present.
Nov 28 10:15:02 np0005538960 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Nov 28 10:15:02 np0005538960 kernel: Hypervisor detected: KVM
Nov 28 10:15:02 np0005538960 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Nov 28 10:15:02 np0005538960 kernel: kvm-clock: using sched offset of 3241015182 cycles
Nov 28 10:15:02 np0005538960 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Nov 28 10:15:02 np0005538960 kernel: tsc: Detected 2799.998 MHz processor
Nov 28 10:15:02 np0005538960 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Nov 28 10:15:02 np0005538960 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Nov 28 10:15:02 np0005538960 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Nov 28 10:15:02 np0005538960 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Nov 28 10:15:02 np0005538960 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Nov 28 10:15:02 np0005538960 kernel: Using GB pages for direct mapping
Nov 28 10:15:02 np0005538960 kernel: RAMDISK: [mem 0x2d83a000-0x32c14fff]
Nov 28 10:15:02 np0005538960 kernel: ACPI: Early table checksum verification disabled
Nov 28 10:15:02 np0005538960 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Nov 28 10:15:02 np0005538960 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 10:15:02 np0005538960 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 10:15:02 np0005538960 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 10:15:02 np0005538960 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Nov 28 10:15:02 np0005538960 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 10:15:02 np0005538960 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Nov 28 10:15:02 np0005538960 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Nov 28 10:15:02 np0005538960 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Nov 28 10:15:02 np0005538960 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Nov 28 10:15:02 np0005538960 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Nov 28 10:15:02 np0005538960 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Nov 28 10:15:02 np0005538960 kernel: No NUMA configuration found
Nov 28 10:15:02 np0005538960 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Nov 28 10:15:02 np0005538960 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Nov 28 10:15:02 np0005538960 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Nov 28 10:15:02 np0005538960 kernel: Zone ranges:
Nov 28 10:15:02 np0005538960 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Nov 28 10:15:02 np0005538960 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Nov 28 10:15:02 np0005538960 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Nov 28 10:15:02 np0005538960 kernel:  Device   empty
Nov 28 10:15:02 np0005538960 kernel: Movable zone start for each node
Nov 28 10:15:02 np0005538960 kernel: Early memory node ranges
Nov 28 10:15:02 np0005538960 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Nov 28 10:15:02 np0005538960 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Nov 28 10:15:02 np0005538960 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Nov 28 10:15:02 np0005538960 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Nov 28 10:15:02 np0005538960 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Nov 28 10:15:02 np0005538960 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Nov 28 10:15:02 np0005538960 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Nov 28 10:15:02 np0005538960 kernel: ACPI: PM-Timer IO Port: 0x608
Nov 28 10:15:02 np0005538960 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Nov 28 10:15:02 np0005538960 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Nov 28 10:15:02 np0005538960 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Nov 28 10:15:02 np0005538960 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Nov 28 10:15:02 np0005538960 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Nov 28 10:15:02 np0005538960 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Nov 28 10:15:02 np0005538960 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Nov 28 10:15:02 np0005538960 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Nov 28 10:15:02 np0005538960 kernel: TSC deadline timer available
Nov 28 10:15:02 np0005538960 kernel: CPU topo: Max. logical packages:   8
Nov 28 10:15:02 np0005538960 kernel: CPU topo: Max. logical dies:       8
Nov 28 10:15:02 np0005538960 kernel: CPU topo: Max. dies per package:   1
Nov 28 10:15:02 np0005538960 kernel: CPU topo: Max. threads per core:   1
Nov 28 10:15:02 np0005538960 kernel: CPU topo: Num. cores per package:     1
Nov 28 10:15:02 np0005538960 kernel: CPU topo: Num. threads per package:   1
Nov 28 10:15:02 np0005538960 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Nov 28 10:15:02 np0005538960 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Nov 28 10:15:02 np0005538960 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Nov 28 10:15:02 np0005538960 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Nov 28 10:15:02 np0005538960 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Nov 28 10:15:02 np0005538960 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Nov 28 10:15:02 np0005538960 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Nov 28 10:15:02 np0005538960 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Nov 28 10:15:02 np0005538960 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Nov 28 10:15:02 np0005538960 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Nov 28 10:15:02 np0005538960 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Nov 28 10:15:02 np0005538960 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Nov 28 10:15:02 np0005538960 kernel: Booting paravirtualized kernel on KVM
Nov 28 10:15:02 np0005538960 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Nov 28 10:15:02 np0005538960 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Nov 28 10:15:02 np0005538960 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Nov 28 10:15:02 np0005538960 kernel: kvm-guest: PV spinlocks disabled, no host support
Nov 28 10:15:02 np0005538960 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 28 10:15:02 np0005538960 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64", will be passed to user space.
Nov 28 10:15:02 np0005538960 kernel: random: crng init done
Nov 28 10:15:02 np0005538960 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: Fallback order for Node 0: 0 
Nov 28 10:15:02 np0005538960 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Nov 28 10:15:02 np0005538960 kernel: Policy zone: Normal
Nov 28 10:15:02 np0005538960 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Nov 28 10:15:02 np0005538960 kernel: software IO TLB: area num 8.
Nov 28 10:15:02 np0005538960 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Nov 28 10:15:02 np0005538960 kernel: ftrace: allocating 49313 entries in 193 pages
Nov 28 10:15:02 np0005538960 kernel: ftrace: allocated 193 pages with 3 groups
Nov 28 10:15:02 np0005538960 kernel: Dynamic Preempt: voluntary
Nov 28 10:15:02 np0005538960 kernel: rcu: Preemptible hierarchical RCU implementation.
Nov 28 10:15:02 np0005538960 kernel: rcu: 	RCU event tracing is enabled.
Nov 28 10:15:02 np0005538960 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Nov 28 10:15:02 np0005538960 kernel: 	Trampoline variant of Tasks RCU enabled.
Nov 28 10:15:02 np0005538960 kernel: 	Rude variant of Tasks RCU enabled.
Nov 28 10:15:02 np0005538960 kernel: 	Tracing variant of Tasks RCU enabled.
Nov 28 10:15:02 np0005538960 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Nov 28 10:15:02 np0005538960 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Nov 28 10:15:02 np0005538960 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 28 10:15:02 np0005538960 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 28 10:15:02 np0005538960 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Nov 28 10:15:02 np0005538960 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Nov 28 10:15:02 np0005538960 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Nov 28 10:15:02 np0005538960 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Nov 28 10:15:02 np0005538960 kernel: Console: colour VGA+ 80x25
Nov 28 10:15:02 np0005538960 kernel: printk: console [ttyS0] enabled
Nov 28 10:15:02 np0005538960 kernel: ACPI: Core revision 20230331
Nov 28 10:15:02 np0005538960 kernel: APIC: Switch to symmetric I/O mode setup
Nov 28 10:15:02 np0005538960 kernel: x2apic enabled
Nov 28 10:15:02 np0005538960 kernel: APIC: Switched APIC routing to: physical x2apic
Nov 28 10:15:02 np0005538960 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Nov 28 10:15:02 np0005538960 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Nov 28 10:15:02 np0005538960 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Nov 28 10:15:02 np0005538960 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Nov 28 10:15:02 np0005538960 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Nov 28 10:15:02 np0005538960 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Nov 28 10:15:02 np0005538960 kernel: Spectre V2 : Mitigation: Retpolines
Nov 28 10:15:02 np0005538960 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Nov 28 10:15:02 np0005538960 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Nov 28 10:15:02 np0005538960 kernel: RETBleed: Mitigation: untrained return thunk
Nov 28 10:15:02 np0005538960 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Nov 28 10:15:02 np0005538960 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Nov 28 10:15:02 np0005538960 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Nov 28 10:15:02 np0005538960 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Nov 28 10:15:02 np0005538960 kernel: x86/bugs: return thunk changed
Nov 28 10:15:02 np0005538960 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Nov 28 10:15:02 np0005538960 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Nov 28 10:15:02 np0005538960 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Nov 28 10:15:02 np0005538960 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Nov 28 10:15:02 np0005538960 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Nov 28 10:15:02 np0005538960 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Nov 28 10:15:02 np0005538960 kernel: Freeing SMP alternatives memory: 40K
Nov 28 10:15:02 np0005538960 kernel: pid_max: default: 32768 minimum: 301
Nov 28 10:15:02 np0005538960 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Nov 28 10:15:02 np0005538960 kernel: landlock: Up and running.
Nov 28 10:15:02 np0005538960 kernel: Yama: becoming mindful.
Nov 28 10:15:02 np0005538960 kernel: SELinux:  Initializing.
Nov 28 10:15:02 np0005538960 kernel: LSM support for eBPF active
Nov 28 10:15:02 np0005538960 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Nov 28 10:15:02 np0005538960 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Nov 28 10:15:02 np0005538960 kernel: ... version:                0
Nov 28 10:15:02 np0005538960 kernel: ... bit width:              48
Nov 28 10:15:02 np0005538960 kernel: ... generic registers:      6
Nov 28 10:15:02 np0005538960 kernel: ... value mask:             0000ffffffffffff
Nov 28 10:15:02 np0005538960 kernel: ... max period:             00007fffffffffff
Nov 28 10:15:02 np0005538960 kernel: ... fixed-purpose events:   0
Nov 28 10:15:02 np0005538960 kernel: ... event mask:             000000000000003f
Nov 28 10:15:02 np0005538960 kernel: signal: max sigframe size: 1776
Nov 28 10:15:02 np0005538960 kernel: rcu: Hierarchical SRCU implementation.
Nov 28 10:15:02 np0005538960 kernel: rcu: 	Max phase no-delay instances is 400.
Nov 28 10:15:02 np0005538960 kernel: smp: Bringing up secondary CPUs ...
Nov 28 10:15:02 np0005538960 kernel: smpboot: x86: Booting SMP configuration:
Nov 28 10:15:02 np0005538960 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Nov 28 10:15:02 np0005538960 kernel: smp: Brought up 1 node, 8 CPUs
Nov 28 10:15:02 np0005538960 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Nov 28 10:15:02 np0005538960 kernel: node 0 deferred pages initialised in 8ms
Nov 28 10:15:02 np0005538960 kernel: Memory: 7765840K/8388068K available (16384K kernel code, 5787K rwdata, 13900K rodata, 4192K init, 7172K bss, 616276K reserved, 0K cma-reserved)
Nov 28 10:15:02 np0005538960 kernel: devtmpfs: initialized
Nov 28 10:15:02 np0005538960 kernel: x86/mm: Memory block size: 128MB
Nov 28 10:15:02 np0005538960 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Nov 28 10:15:02 np0005538960 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: pinctrl core: initialized pinctrl subsystem
Nov 28 10:15:02 np0005538960 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Nov 28 10:15:02 np0005538960 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Nov 28 10:15:02 np0005538960 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Nov 28 10:15:02 np0005538960 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Nov 28 10:15:02 np0005538960 kernel: audit: initializing netlink subsys (disabled)
Nov 28 10:15:02 np0005538960 kernel: audit: type=2000 audit(1764342900.914:1): state=initialized audit_enabled=0 res=1
Nov 28 10:15:02 np0005538960 kernel: thermal_sys: Registered thermal governor 'fair_share'
Nov 28 10:15:02 np0005538960 kernel: thermal_sys: Registered thermal governor 'step_wise'
Nov 28 10:15:02 np0005538960 kernel: thermal_sys: Registered thermal governor 'user_space'
Nov 28 10:15:02 np0005538960 kernel: cpuidle: using governor menu
Nov 28 10:15:02 np0005538960 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Nov 28 10:15:02 np0005538960 kernel: PCI: Using configuration type 1 for base access
Nov 28 10:15:02 np0005538960 kernel: PCI: Using configuration type 1 for extended access
Nov 28 10:15:02 np0005538960 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Nov 28 10:15:02 np0005538960 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Nov 28 10:15:02 np0005538960 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Nov 28 10:15:02 np0005538960 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Nov 28 10:15:02 np0005538960 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Nov 28 10:15:02 np0005538960 kernel: Demotion targets for Node 0: null
Nov 28 10:15:02 np0005538960 kernel: cryptd: max_cpu_qlen set to 1000
Nov 28 10:15:02 np0005538960 kernel: ACPI: Added _OSI(Module Device)
Nov 28 10:15:02 np0005538960 kernel: ACPI: Added _OSI(Processor Device)
Nov 28 10:15:02 np0005538960 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Nov 28 10:15:02 np0005538960 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Nov 28 10:15:02 np0005538960 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Nov 28 10:15:02 np0005538960 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Nov 28 10:15:02 np0005538960 kernel: ACPI: Interpreter enabled
Nov 28 10:15:02 np0005538960 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Nov 28 10:15:02 np0005538960 kernel: ACPI: Using IOAPIC for interrupt routing
Nov 28 10:15:02 np0005538960 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Nov 28 10:15:02 np0005538960 kernel: PCI: Using E820 reservations for host bridge windows
Nov 28 10:15:02 np0005538960 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Nov 28 10:15:02 np0005538960 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Nov 28 10:15:02 np0005538960 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [3] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [4] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [5] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [6] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [7] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [8] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [9] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [10] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [11] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [12] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [13] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [14] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [15] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [16] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [17] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [18] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [19] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [20] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [21] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [22] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [23] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [24] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [25] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [26] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [27] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [28] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [29] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [30] registered
Nov 28 10:15:02 np0005538960 kernel: acpiphp: Slot [31] registered
Nov 28 10:15:02 np0005538960 kernel: PCI host bridge to bus 0000:00
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Nov 28 10:15:02 np0005538960 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Nov 28 10:15:02 np0005538960 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Nov 28 10:15:02 np0005538960 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Nov 28 10:15:02 np0005538960 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Nov 28 10:15:02 np0005538960 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Nov 28 10:15:02 np0005538960 kernel: iommu: Default domain type: Translated
Nov 28 10:15:02 np0005538960 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Nov 28 10:15:02 np0005538960 kernel: SCSI subsystem initialized
Nov 28 10:15:02 np0005538960 kernel: ACPI: bus type USB registered
Nov 28 10:15:02 np0005538960 kernel: usbcore: registered new interface driver usbfs
Nov 28 10:15:02 np0005538960 kernel: usbcore: registered new interface driver hub
Nov 28 10:15:02 np0005538960 kernel: usbcore: registered new device driver usb
Nov 28 10:15:02 np0005538960 kernel: pps_core: LinuxPPS API ver. 1 registered
Nov 28 10:15:02 np0005538960 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Nov 28 10:15:02 np0005538960 kernel: PTP clock support registered
Nov 28 10:15:02 np0005538960 kernel: EDAC MC: Ver: 3.0.0
Nov 28 10:15:02 np0005538960 kernel: NetLabel: Initializing
Nov 28 10:15:02 np0005538960 kernel: NetLabel:  domain hash size = 128
Nov 28 10:15:02 np0005538960 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Nov 28 10:15:02 np0005538960 kernel: NetLabel:  unlabeled traffic allowed by default
Nov 28 10:15:02 np0005538960 kernel: PCI: Using ACPI for IRQ routing
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Nov 28 10:15:02 np0005538960 kernel: vgaarb: loaded
Nov 28 10:15:02 np0005538960 kernel: clocksource: Switched to clocksource kvm-clock
Nov 28 10:15:02 np0005538960 kernel: VFS: Disk quotas dquot_6.6.0
Nov 28 10:15:02 np0005538960 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Nov 28 10:15:02 np0005538960 kernel: pnp: PnP ACPI init
Nov 28 10:15:02 np0005538960 kernel: pnp: PnP ACPI: found 5 devices
Nov 28 10:15:02 np0005538960 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Nov 28 10:15:02 np0005538960 kernel: NET: Registered PF_INET protocol family
Nov 28 10:15:02 np0005538960 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Nov 28 10:15:02 np0005538960 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Nov 28 10:15:02 np0005538960 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Nov 28 10:15:02 np0005538960 kernel: NET: Registered PF_XDP protocol family
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Nov 28 10:15:02 np0005538960 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Nov 28 10:15:02 np0005538960 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Nov 28 10:15:02 np0005538960 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 75333 usecs
Nov 28 10:15:02 np0005538960 kernel: PCI: CLS 0 bytes, default 64
Nov 28 10:15:02 np0005538960 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Nov 28 10:15:02 np0005538960 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Nov 28 10:15:02 np0005538960 kernel: ACPI: bus type thunderbolt registered
Nov 28 10:15:02 np0005538960 kernel: Trying to unpack rootfs image as initramfs...
Nov 28 10:15:02 np0005538960 kernel: Initialise system trusted keyrings
Nov 28 10:15:02 np0005538960 kernel: Key type blacklist registered
Nov 28 10:15:02 np0005538960 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Nov 28 10:15:02 np0005538960 kernel: zbud: loaded
Nov 28 10:15:02 np0005538960 kernel: integrity: Platform Keyring initialized
Nov 28 10:15:02 np0005538960 kernel: integrity: Machine keyring initialized
Nov 28 10:15:02 np0005538960 kernel: Freeing initrd memory: 85868K
Nov 28 10:15:02 np0005538960 kernel: NET: Registered PF_ALG protocol family
Nov 28 10:15:02 np0005538960 kernel: xor: automatically using best checksumming function   avx       
Nov 28 10:15:02 np0005538960 kernel: Key type asymmetric registered
Nov 28 10:15:02 np0005538960 kernel: Asymmetric key parser 'x509' registered
Nov 28 10:15:02 np0005538960 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Nov 28 10:15:02 np0005538960 kernel: io scheduler mq-deadline registered
Nov 28 10:15:02 np0005538960 kernel: io scheduler kyber registered
Nov 28 10:15:02 np0005538960 kernel: io scheduler bfq registered
Nov 28 10:15:02 np0005538960 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Nov 28 10:15:02 np0005538960 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Nov 28 10:15:02 np0005538960 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Nov 28 10:15:02 np0005538960 kernel: ACPI: button: Power Button [PWRF]
Nov 28 10:15:02 np0005538960 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Nov 28 10:15:02 np0005538960 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Nov 28 10:15:02 np0005538960 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Nov 28 10:15:02 np0005538960 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Nov 28 10:15:02 np0005538960 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Nov 28 10:15:02 np0005538960 kernel: Non-volatile memory driver v1.3
Nov 28 10:15:02 np0005538960 kernel: rdac: device handler registered
Nov 28 10:15:02 np0005538960 kernel: hp_sw: device handler registered
Nov 28 10:15:02 np0005538960 kernel: emc: device handler registered
Nov 28 10:15:02 np0005538960 kernel: alua: device handler registered
Nov 28 10:15:02 np0005538960 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Nov 28 10:15:02 np0005538960 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Nov 28 10:15:02 np0005538960 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Nov 28 10:15:02 np0005538960 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Nov 28 10:15:02 np0005538960 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Nov 28 10:15:02 np0005538960 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Nov 28 10:15:02 np0005538960 kernel: usb usb1: Product: UHCI Host Controller
Nov 28 10:15:02 np0005538960 kernel: usb usb1: Manufacturer: Linux 5.14.0-642.el9.x86_64 uhci_hcd
Nov 28 10:15:02 np0005538960 kernel: usb usb1: SerialNumber: 0000:00:01.2
Nov 28 10:15:02 np0005538960 kernel: hub 1-0:1.0: USB hub found
Nov 28 10:15:02 np0005538960 kernel: hub 1-0:1.0: 2 ports detected
Nov 28 10:15:02 np0005538960 kernel: usbcore: registered new interface driver usbserial_generic
Nov 28 10:15:02 np0005538960 kernel: usbserial: USB Serial support registered for generic
Nov 28 10:15:02 np0005538960 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Nov 28 10:15:02 np0005538960 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Nov 28 10:15:02 np0005538960 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Nov 28 10:15:02 np0005538960 kernel: mousedev: PS/2 mouse device common for all mice
Nov 28 10:15:02 np0005538960 kernel: rtc_cmos 00:04: RTC can wake from S4
Nov 28 10:15:02 np0005538960 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Nov 28 10:15:02 np0005538960 kernel: rtc_cmos 00:04: registered as rtc0
Nov 28 10:15:02 np0005538960 kernel: rtc_cmos 00:04: setting system clock to 2025-11-28T15:15:01 UTC (1764342901)
Nov 28 10:15:02 np0005538960 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Nov 28 10:15:02 np0005538960 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Nov 28 10:15:02 np0005538960 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Nov 28 10:15:02 np0005538960 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Nov 28 10:15:02 np0005538960 kernel: hid: raw HID events driver (C) Jiri Kosina
Nov 28 10:15:02 np0005538960 kernel: usbcore: registered new interface driver usbhid
Nov 28 10:15:02 np0005538960 kernel: usbhid: USB HID core driver
Nov 28 10:15:02 np0005538960 kernel: drop_monitor: Initializing network drop monitor service
Nov 28 10:15:02 np0005538960 kernel: Initializing XFRM netlink socket
Nov 28 10:15:02 np0005538960 kernel: NET: Registered PF_INET6 protocol family
Nov 28 10:15:02 np0005538960 kernel: Segment Routing with IPv6
Nov 28 10:15:02 np0005538960 kernel: NET: Registered PF_PACKET protocol family
Nov 28 10:15:02 np0005538960 kernel: mpls_gso: MPLS GSO support
Nov 28 10:15:02 np0005538960 kernel: IPI shorthand broadcast: enabled
Nov 28 10:15:02 np0005538960 kernel: AVX2 version of gcm_enc/dec engaged.
Nov 28 10:15:02 np0005538960 kernel: AES CTR mode by8 optimization enabled
Nov 28 10:15:02 np0005538960 kernel: sched_clock: Marking stable (1180010063, 150660148)->(1446434173, -115763962)
Nov 28 10:15:02 np0005538960 kernel: registered taskstats version 1
Nov 28 10:15:02 np0005538960 kernel: Loading compiled-in X.509 certificates
Nov 28 10:15:02 np0005538960 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 28 10:15:02 np0005538960 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Nov 28 10:15:02 np0005538960 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Nov 28 10:15:02 np0005538960 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Nov 28 10:15:02 np0005538960 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Nov 28 10:15:02 np0005538960 kernel: Demotion targets for Node 0: null
Nov 28 10:15:02 np0005538960 kernel: page_owner is disabled
Nov 28 10:15:02 np0005538960 kernel: Key type .fscrypt registered
Nov 28 10:15:02 np0005538960 kernel: Key type fscrypt-provisioning registered
Nov 28 10:15:02 np0005538960 kernel: Key type big_key registered
Nov 28 10:15:02 np0005538960 kernel: Key type encrypted registered
Nov 28 10:15:02 np0005538960 kernel: ima: No TPM chip found, activating TPM-bypass!
Nov 28 10:15:02 np0005538960 kernel: Loading compiled-in module X.509 certificates
Nov 28 10:15:02 np0005538960 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 8ec4bd273f582f9a9b9a494ae677ca1f1488f19e'
Nov 28 10:15:02 np0005538960 kernel: ima: Allocated hash algorithm: sha256
Nov 28 10:15:02 np0005538960 kernel: ima: No architecture policies found
Nov 28 10:15:02 np0005538960 kernel: evm: Initialising EVM extended attributes:
Nov 28 10:15:02 np0005538960 kernel: evm: security.selinux
Nov 28 10:15:02 np0005538960 kernel: evm: security.SMACK64 (disabled)
Nov 28 10:15:02 np0005538960 kernel: evm: security.SMACK64EXEC (disabled)
Nov 28 10:15:02 np0005538960 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Nov 28 10:15:02 np0005538960 kernel: evm: security.SMACK64MMAP (disabled)
Nov 28 10:15:02 np0005538960 kernel: evm: security.apparmor (disabled)
Nov 28 10:15:02 np0005538960 kernel: evm: security.ima
Nov 28 10:15:02 np0005538960 kernel: evm: security.capability
Nov 28 10:15:02 np0005538960 kernel: evm: HMAC attrs: 0x1
Nov 28 10:15:02 np0005538960 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Nov 28 10:15:02 np0005538960 kernel: Running certificate verification RSA selftest
Nov 28 10:15:02 np0005538960 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Nov 28 10:15:02 np0005538960 kernel: Running certificate verification ECDSA selftest
Nov 28 10:15:02 np0005538960 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Nov 28 10:15:02 np0005538960 kernel: clk: Disabling unused clocks
Nov 28 10:15:02 np0005538960 kernel: Freeing unused decrypted memory: 2028K
Nov 28 10:15:02 np0005538960 kernel: Freeing unused kernel image (initmem) memory: 4192K
Nov 28 10:15:02 np0005538960 kernel: Write protecting the kernel read-only data: 30720k
Nov 28 10:15:02 np0005538960 kernel: Freeing unused kernel image (rodata/data gap) memory: 436K
Nov 28 10:15:02 np0005538960 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Nov 28 10:15:02 np0005538960 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Nov 28 10:15:02 np0005538960 kernel: usb 1-1: Product: QEMU USB Tablet
Nov 28 10:15:02 np0005538960 kernel: usb 1-1: Manufacturer: QEMU
Nov 28 10:15:02 np0005538960 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Nov 28 10:15:02 np0005538960 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Nov 28 10:15:02 np0005538960 kernel: Run /init as init process
Nov 28 10:15:02 np0005538960 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Nov 28 10:15:02 np0005538960 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Nov 28 10:15:02 np0005538960 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 10:15:02 np0005538960 systemd: Detected virtualization kvm.
Nov 28 10:15:02 np0005538960 systemd: Detected architecture x86-64.
Nov 28 10:15:02 np0005538960 systemd: Running in initrd.
Nov 28 10:15:02 np0005538960 systemd: No hostname configured, using default hostname.
Nov 28 10:15:02 np0005538960 systemd: Hostname set to <localhost>.
Nov 28 10:15:02 np0005538960 systemd: Initializing machine ID from VM UUID.
Nov 28 10:15:02 np0005538960 systemd: Queued start job for default target Initrd Default Target.
Nov 28 10:15:02 np0005538960 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 28 10:15:02 np0005538960 systemd: Reached target Local Encrypted Volumes.
Nov 28 10:15:02 np0005538960 systemd: Reached target Initrd /usr File System.
Nov 28 10:15:02 np0005538960 systemd: Reached target Local File Systems.
Nov 28 10:15:02 np0005538960 systemd: Reached target Path Units.
Nov 28 10:15:02 np0005538960 systemd: Reached target Slice Units.
Nov 28 10:15:02 np0005538960 systemd: Reached target Swaps.
Nov 28 10:15:02 np0005538960 systemd: Reached target Timer Units.
Nov 28 10:15:02 np0005538960 systemd: Listening on D-Bus System Message Bus Socket.
Nov 28 10:15:02 np0005538960 systemd: Listening on Journal Socket (/dev/log).
Nov 28 10:15:02 np0005538960 systemd: Listening on Journal Socket.
Nov 28 10:15:02 np0005538960 systemd: Listening on udev Control Socket.
Nov 28 10:15:02 np0005538960 systemd: Listening on udev Kernel Socket.
Nov 28 10:15:02 np0005538960 systemd: Reached target Socket Units.
Nov 28 10:15:02 np0005538960 systemd: Starting Create List of Static Device Nodes...
Nov 28 10:15:02 np0005538960 systemd: Starting Journal Service...
Nov 28 10:15:02 np0005538960 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 28 10:15:02 np0005538960 systemd: Starting Apply Kernel Variables...
Nov 28 10:15:02 np0005538960 systemd: Starting Create System Users...
Nov 28 10:15:02 np0005538960 systemd: Starting Setup Virtual Console...
Nov 28 10:15:02 np0005538960 systemd: Finished Create List of Static Device Nodes.
Nov 28 10:15:02 np0005538960 systemd: Finished Apply Kernel Variables.
Nov 28 10:15:02 np0005538960 systemd: Finished Create System Users.
Nov 28 10:15:02 np0005538960 systemd-journald[306]: Journal started
Nov 28 10:15:02 np0005538960 systemd-journald[306]: Runtime Journal (/run/log/journal/514cc5622bfa41c3bde8d8f80e6fac29) is 8.0M, max 153.6M, 145.6M free.
Nov 28 10:15:02 np0005538960 systemd-sysusers[311]: Creating group 'users' with GID 100.
Nov 28 10:15:02 np0005538960 systemd-sysusers[311]: Creating group 'dbus' with GID 81.
Nov 28 10:15:02 np0005538960 systemd-sysusers[311]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Nov 28 10:15:02 np0005538960 systemd: Started Journal Service.
Nov 28 10:15:02 np0005538960 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 28 10:15:02 np0005538960 systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 10:15:02 np0005538960 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 28 10:15:02 np0005538960 systemd[1]: Finished Setup Virtual Console.
Nov 28 10:15:02 np0005538960 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Nov 28 10:15:02 np0005538960 systemd[1]: Starting dracut cmdline hook...
Nov 28 10:15:02 np0005538960 dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Nov 28 10:15:02 np0005538960 systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 10:15:02 np0005538960 dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-642.el9.x86_64 root=UUID=b277050f-8ace-464d-abb6-4c46d4c45253 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Nov 28 10:15:02 np0005538960 systemd[1]: Finished dracut cmdline hook.
Nov 28 10:15:02 np0005538960 systemd[1]: Starting dracut pre-udev hook...
Nov 28 10:15:02 np0005538960 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Nov 28 10:15:02 np0005538960 kernel: device-mapper: uevent: version 1.0.3
Nov 28 10:15:02 np0005538960 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Nov 28 10:15:02 np0005538960 kernel: RPC: Registered named UNIX socket transport module.
Nov 28 10:15:02 np0005538960 kernel: RPC: Registered udp transport module.
Nov 28 10:15:02 np0005538960 kernel: RPC: Registered tcp transport module.
Nov 28 10:15:02 np0005538960 kernel: RPC: Registered tcp-with-tls transport module.
Nov 28 10:15:02 np0005538960 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Nov 28 10:15:02 np0005538960 rpc.statd[443]: Version 2.5.4 starting
Nov 28 10:15:02 np0005538960 rpc.statd[443]: Initializing NSM state
Nov 28 10:15:02 np0005538960 rpc.idmapd[448]: Setting log level to 0
Nov 28 10:15:02 np0005538960 systemd[1]: Finished dracut pre-udev hook.
Nov 28 10:15:02 np0005538960 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 10:15:02 np0005538960 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 10:15:02 np0005538960 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 10:15:02 np0005538960 systemd[1]: Starting dracut pre-trigger hook...
Nov 28 10:15:02 np0005538960 systemd[1]: Finished dracut pre-trigger hook.
Nov 28 10:15:02 np0005538960 systemd[1]: Starting Coldplug All udev Devices...
Nov 28 10:15:03 np0005538960 systemd[1]: Created slice Slice /system/modprobe.
Nov 28 10:15:03 np0005538960 systemd[1]: Starting Load Kernel Module configfs...
Nov 28 10:15:03 np0005538960 systemd[1]: Finished Coldplug All udev Devices.
Nov 28 10:15:03 np0005538960 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 10:15:03 np0005538960 systemd[1]: Finished Load Kernel Module configfs.
Nov 28 10:15:03 np0005538960 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 28 10:15:03 np0005538960 systemd[1]: Reached target Network.
Nov 28 10:15:03 np0005538960 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Nov 28 10:15:03 np0005538960 systemd[1]: Starting dracut initqueue hook...
Nov 28 10:15:03 np0005538960 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Nov 28 10:15:03 np0005538960 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Nov 28 10:15:03 np0005538960 kernel: vda: vda1
Nov 28 10:15:03 np0005538960 systemd[1]: Mounting Kernel Configuration File System...
Nov 28 10:15:03 np0005538960 systemd[1]: Mounted Kernel Configuration File System.
Nov 28 10:15:03 np0005538960 systemd[1]: Reached target System Initialization.
Nov 28 10:15:03 np0005538960 systemd[1]: Reached target Basic System.
Nov 28 10:15:03 np0005538960 kernel: scsi host0: ata_piix
Nov 28 10:15:03 np0005538960 kernel: scsi host1: ata_piix
Nov 28 10:15:03 np0005538960 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Nov 28 10:15:03 np0005538960 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Nov 28 10:15:03 np0005538960 systemd-udevd[463]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:15:03 np0005538960 systemd[1]: Found device /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 28 10:15:03 np0005538960 systemd[1]: Reached target Initrd Root Device.
Nov 28 10:15:03 np0005538960 kernel: ata1: found unknown device (class 0)
Nov 28 10:15:03 np0005538960 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Nov 28 10:15:03 np0005538960 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Nov 28 10:15:03 np0005538960 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Nov 28 10:15:03 np0005538960 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Nov 28 10:15:03 np0005538960 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Nov 28 10:15:03 np0005538960 systemd[1]: Finished dracut initqueue hook.
Nov 28 10:15:03 np0005538960 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 10:15:03 np0005538960 systemd[1]: Reached target Remote Encrypted Volumes.
Nov 28 10:15:03 np0005538960 systemd[1]: Reached target Remote File Systems.
Nov 28 10:15:03 np0005538960 systemd[1]: Starting dracut pre-mount hook...
Nov 28 10:15:03 np0005538960 systemd[1]: Finished dracut pre-mount hook.
Nov 28 10:15:03 np0005538960 systemd[1]: Starting File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253...
Nov 28 10:15:03 np0005538960 systemd-fsck[557]: /usr/sbin/fsck.xfs: XFS file system.
Nov 28 10:15:03 np0005538960 systemd[1]: Finished File System Check on /dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253.
Nov 28 10:15:03 np0005538960 systemd[1]: Mounting /sysroot...
Nov 28 10:15:04 np0005538960 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Nov 28 10:15:04 np0005538960 kernel: XFS (vda1): Mounting V5 Filesystem b277050f-8ace-464d-abb6-4c46d4c45253
Nov 28 10:15:04 np0005538960 kernel: XFS (vda1): Ending clean mount
Nov 28 10:15:04 np0005538960 systemd[1]: Mounted /sysroot.
Nov 28 10:15:04 np0005538960 systemd[1]: Reached target Initrd Root File System.
Nov 28 10:15:04 np0005538960 systemd[1]: Starting Mountpoints Configured in the Real Root...
Nov 28 10:15:04 np0005538960 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Finished Mountpoints Configured in the Real Root.
Nov 28 10:15:04 np0005538960 systemd[1]: Reached target Initrd File Systems.
Nov 28 10:15:04 np0005538960 systemd[1]: Reached target Initrd Default Target.
Nov 28 10:15:04 np0005538960 systemd[1]: Starting dracut mount hook...
Nov 28 10:15:04 np0005538960 systemd[1]: Finished dracut mount hook.
Nov 28 10:15:04 np0005538960 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Nov 28 10:15:04 np0005538960 rpc.idmapd[448]: exiting on signal 15
Nov 28 10:15:04 np0005538960 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Nov 28 10:15:04 np0005538960 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Network.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Remote Encrypted Volumes.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Timer Units.
Nov 28 10:15:04 np0005538960 systemd[1]: dbus.socket: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Closed D-Bus System Message Bus Socket.
Nov 28 10:15:04 np0005538960 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Initrd Default Target.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Basic System.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Initrd Root Device.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Initrd /usr File System.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Path Units.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Remote File Systems.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Preparation for Remote File Systems.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Slice Units.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Socket Units.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target System Initialization.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Local File Systems.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Swaps.
Nov 28 10:15:04 np0005538960 systemd[1]: dracut-mount.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped dracut mount hook.
Nov 28 10:15:04 np0005538960 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped dracut pre-mount hook.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped target Local Encrypted Volumes.
Nov 28 10:15:04 np0005538960 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Nov 28 10:15:04 np0005538960 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped dracut initqueue hook.
Nov 28 10:15:04 np0005538960 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped Apply Kernel Variables.
Nov 28 10:15:04 np0005538960 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped Create Volatile Files and Directories.
Nov 28 10:15:04 np0005538960 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped Coldplug All udev Devices.
Nov 28 10:15:04 np0005538960 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped dracut pre-trigger hook.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Nov 28 10:15:04 np0005538960 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped Setup Virtual Console.
Nov 28 10:15:04 np0005538960 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Nov 28 10:15:04 np0005538960 systemd[1]: systemd-udevd.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Nov 28 10:15:04 np0005538960 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Closed udev Control Socket.
Nov 28 10:15:04 np0005538960 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Closed udev Kernel Socket.
Nov 28 10:15:04 np0005538960 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped dracut pre-udev hook.
Nov 28 10:15:04 np0005538960 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped dracut cmdline hook.
Nov 28 10:15:04 np0005538960 systemd[1]: Starting Cleanup udev Database...
Nov 28 10:15:04 np0005538960 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped Create Static Device Nodes in /dev.
Nov 28 10:15:04 np0005538960 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped Create List of Static Device Nodes.
Nov 28 10:15:04 np0005538960 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Stopped Create System Users.
Nov 28 10:15:04 np0005538960 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Nov 28 10:15:04 np0005538960 systemd[1]: Finished Cleanup udev Database.
Nov 28 10:15:04 np0005538960 systemd[1]: Reached target Switch Root.
Nov 28 10:15:04 np0005538960 systemd[1]: Starting Switch Root...
Nov 28 10:15:04 np0005538960 systemd[1]: Switching root.
Nov 28 10:15:04 np0005538960 systemd-journald[306]: Journal stopped
Nov 28 10:15:05 np0005538960 systemd-journald: Received SIGTERM from PID 1 (systemd).
Nov 28 10:15:05 np0005538960 kernel: audit: type=1404 audit(1764342904.528:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Nov 28 10:15:05 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 10:15:05 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 10:15:05 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 10:15:05 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 10:15:05 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 10:15:05 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 10:15:05 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 10:15:05 np0005538960 kernel: audit: type=1403 audit(1764342904.665:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Nov 28 10:15:05 np0005538960 systemd: Successfully loaded SELinux policy in 139.948ms.
Nov 28 10:15:05 np0005538960 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 25.658ms.
Nov 28 10:15:05 np0005538960 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Nov 28 10:15:05 np0005538960 systemd: Detected virtualization kvm.
Nov 28 10:15:05 np0005538960 systemd: Detected architecture x86-64.
Nov 28 10:15:05 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:15:05 np0005538960 systemd: initrd-switch-root.service: Deactivated successfully.
Nov 28 10:15:05 np0005538960 systemd: Stopped Switch Root.
Nov 28 10:15:05 np0005538960 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Nov 28 10:15:05 np0005538960 systemd: Created slice Slice /system/getty.
Nov 28 10:15:05 np0005538960 systemd: Created slice Slice /system/serial-getty.
Nov 28 10:15:05 np0005538960 systemd: Created slice Slice /system/sshd-keygen.
Nov 28 10:15:05 np0005538960 systemd: Created slice User and Session Slice.
Nov 28 10:15:05 np0005538960 systemd: Started Dispatch Password Requests to Console Directory Watch.
Nov 28 10:15:05 np0005538960 systemd: Started Forward Password Requests to Wall Directory Watch.
Nov 28 10:15:05 np0005538960 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Nov 28 10:15:05 np0005538960 systemd: Reached target Local Encrypted Volumes.
Nov 28 10:15:05 np0005538960 systemd: Stopped target Switch Root.
Nov 28 10:15:05 np0005538960 systemd: Stopped target Initrd File Systems.
Nov 28 10:15:05 np0005538960 systemd: Stopped target Initrd Root File System.
Nov 28 10:15:05 np0005538960 systemd: Reached target Local Integrity Protected Volumes.
Nov 28 10:15:05 np0005538960 systemd: Reached target Path Units.
Nov 28 10:15:05 np0005538960 systemd: Reached target rpc_pipefs.target.
Nov 28 10:15:05 np0005538960 systemd: Reached target Slice Units.
Nov 28 10:15:05 np0005538960 systemd: Reached target Swaps.
Nov 28 10:15:05 np0005538960 systemd: Reached target Local Verity Protected Volumes.
Nov 28 10:15:05 np0005538960 systemd: Listening on RPCbind Server Activation Socket.
Nov 28 10:15:05 np0005538960 systemd: Reached target RPC Port Mapper.
Nov 28 10:15:05 np0005538960 systemd: Listening on Process Core Dump Socket.
Nov 28 10:15:05 np0005538960 systemd: Listening on initctl Compatibility Named Pipe.
Nov 28 10:15:05 np0005538960 systemd: Listening on udev Control Socket.
Nov 28 10:15:05 np0005538960 systemd: Listening on udev Kernel Socket.
Nov 28 10:15:05 np0005538960 systemd: Mounting Huge Pages File System...
Nov 28 10:15:05 np0005538960 systemd: Mounting POSIX Message Queue File System...
Nov 28 10:15:05 np0005538960 systemd: Mounting Kernel Debug File System...
Nov 28 10:15:05 np0005538960 systemd: Mounting Kernel Trace File System...
Nov 28 10:15:05 np0005538960 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 10:15:05 np0005538960 systemd: Starting Create List of Static Device Nodes...
Nov 28 10:15:05 np0005538960 systemd: Starting Load Kernel Module configfs...
Nov 28 10:15:05 np0005538960 systemd: Starting Load Kernel Module drm...
Nov 28 10:15:05 np0005538960 systemd: Starting Load Kernel Module efi_pstore...
Nov 28 10:15:05 np0005538960 systemd: Starting Load Kernel Module fuse...
Nov 28 10:15:05 np0005538960 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Nov 28 10:15:05 np0005538960 systemd: systemd-fsck-root.service: Deactivated successfully.
Nov 28 10:15:05 np0005538960 systemd: Stopped File System Check on Root Device.
Nov 28 10:15:05 np0005538960 systemd: Stopped Journal Service.
Nov 28 10:15:05 np0005538960 systemd: Starting Journal Service...
Nov 28 10:15:05 np0005538960 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Nov 28 10:15:05 np0005538960 systemd: Starting Generate network units from Kernel command line...
Nov 28 10:15:05 np0005538960 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 10:15:05 np0005538960 systemd: Starting Remount Root and Kernel File Systems...
Nov 28 10:15:05 np0005538960 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Nov 28 10:15:05 np0005538960 systemd: Starting Apply Kernel Variables...
Nov 28 10:15:05 np0005538960 kernel: fuse: init (API version 7.37)
Nov 28 10:15:05 np0005538960 systemd: Starting Coldplug All udev Devices...
Nov 28 10:15:05 np0005538960 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Nov 28 10:15:05 np0005538960 systemd: Mounted Huge Pages File System.
Nov 28 10:15:05 np0005538960 systemd: Mounted POSIX Message Queue File System.
Nov 28 10:15:05 np0005538960 systemd: Mounted Kernel Debug File System.
Nov 28 10:15:05 np0005538960 systemd: Mounted Kernel Trace File System.
Nov 28 10:15:05 np0005538960 systemd: Finished Create List of Static Device Nodes.
Nov 28 10:15:05 np0005538960 systemd: modprobe@configfs.service: Deactivated successfully.
Nov 28 10:15:05 np0005538960 systemd: Finished Load Kernel Module configfs.
Nov 28 10:15:05 np0005538960 systemd-journald[682]: Journal started
Nov 28 10:15:05 np0005538960 systemd-journald[682]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 28 10:15:05 np0005538960 systemd[1]: Queued start job for default target Multi-User System.
Nov 28 10:15:05 np0005538960 systemd[1]: systemd-journald.service: Deactivated successfully.
Nov 28 10:15:05 np0005538960 systemd: Started Journal Service.
Nov 28 10:15:05 np0005538960 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Load Kernel Module efi_pstore.
Nov 28 10:15:05 np0005538960 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Load Kernel Module fuse.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Nov 28 10:15:05 np0005538960 kernel: ACPI: bus type drm_connector registered
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Generate network units from Kernel command line.
Nov 28 10:15:05 np0005538960 systemd[1]: modprobe@drm.service: Deactivated successfully.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Load Kernel Module drm.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Remount Root and Kernel File Systems.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Apply Kernel Variables.
Nov 28 10:15:05 np0005538960 systemd[1]: Mounting FUSE Control File System...
Nov 28 10:15:05 np0005538960 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Rebuild Hardware Database...
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Flush Journal to Persistent Storage...
Nov 28 10:15:05 np0005538960 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Load/Save OS Random Seed...
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Create System Users...
Nov 28 10:15:05 np0005538960 systemd[1]: Mounted FUSE Control File System.
Nov 28 10:15:05 np0005538960 systemd-journald[682]: Runtime Journal (/run/log/journal/1f988c78c563e12389ab342aced42dbb) is 8.0M, max 153.6M, 145.6M free.
Nov 28 10:15:05 np0005538960 systemd-journald[682]: Received client request to flush runtime journal.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Flush Journal to Persistent Storage.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Load/Save OS Random Seed.
Nov 28 10:15:05 np0005538960 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Coldplug All udev Devices.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Create System Users.
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Create Static Device Nodes in /dev...
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Create Static Device Nodes in /dev.
Nov 28 10:15:05 np0005538960 systemd[1]: Reached target Preparation for Local File Systems.
Nov 28 10:15:05 np0005538960 systemd[1]: Reached target Local File Systems.
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Nov 28 10:15:05 np0005538960 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Nov 28 10:15:05 np0005538960 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Nov 28 10:15:05 np0005538960 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Automatic Boot Loader Update...
Nov 28 10:15:05 np0005538960 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Create Volatile Files and Directories...
Nov 28 10:15:05 np0005538960 bootctl[699]: Couldn't find EFI system partition, skipping.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Automatic Boot Loader Update.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Create Volatile Files and Directories.
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Security Auditing Service...
Nov 28 10:15:05 np0005538960 systemd[1]: Starting RPC Bind...
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Rebuild Journal Catalog...
Nov 28 10:15:05 np0005538960 auditd[705]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Nov 28 10:15:05 np0005538960 auditd[705]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Nov 28 10:15:05 np0005538960 systemd[1]: Started RPC Bind.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Rebuild Journal Catalog.
Nov 28 10:15:05 np0005538960 augenrules[710]: /sbin/augenrules: No change
Nov 28 10:15:05 np0005538960 augenrules[725]: No rules
Nov 28 10:15:05 np0005538960 augenrules[725]: enabled 1
Nov 28 10:15:05 np0005538960 augenrules[725]: failure 1
Nov 28 10:15:05 np0005538960 augenrules[725]: pid 705
Nov 28 10:15:05 np0005538960 augenrules[725]: rate_limit 0
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog_limit 8192
Nov 28 10:15:05 np0005538960 augenrules[725]: lost 0
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog 2
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog_wait_time 60000
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog_wait_time_actual 0
Nov 28 10:15:05 np0005538960 augenrules[725]: enabled 1
Nov 28 10:15:05 np0005538960 augenrules[725]: failure 1
Nov 28 10:15:05 np0005538960 augenrules[725]: pid 705
Nov 28 10:15:05 np0005538960 augenrules[725]: rate_limit 0
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog_limit 8192
Nov 28 10:15:05 np0005538960 augenrules[725]: lost 0
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog 0
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog_wait_time 60000
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog_wait_time_actual 0
Nov 28 10:15:05 np0005538960 augenrules[725]: enabled 1
Nov 28 10:15:05 np0005538960 augenrules[725]: failure 1
Nov 28 10:15:05 np0005538960 augenrules[725]: pid 705
Nov 28 10:15:05 np0005538960 augenrules[725]: rate_limit 0
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog_limit 8192
Nov 28 10:15:05 np0005538960 augenrules[725]: lost 0
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog 1
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog_wait_time 60000
Nov 28 10:15:05 np0005538960 augenrules[725]: backlog_wait_time_actual 0
Nov 28 10:15:05 np0005538960 systemd[1]: Started Security Auditing Service.
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Rebuild Hardware Database.
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Update is Completed...
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Update is Completed.
Nov 28 10:15:05 np0005538960 systemd-udevd[733]: Using default interface naming scheme 'rhel-9.0'.
Nov 28 10:15:05 np0005538960 systemd[1]: Started Rule-based Manager for Device Events and Files.
Nov 28 10:15:05 np0005538960 systemd[1]: Reached target System Initialization.
Nov 28 10:15:05 np0005538960 systemd[1]: Started dnf makecache --timer.
Nov 28 10:15:05 np0005538960 systemd[1]: Started Daily rotation of log files.
Nov 28 10:15:05 np0005538960 systemd[1]: Started Daily Cleanup of Temporary Directories.
Nov 28 10:15:05 np0005538960 systemd[1]: Reached target Timer Units.
Nov 28 10:15:05 np0005538960 systemd[1]: Listening on D-Bus System Message Bus Socket.
Nov 28 10:15:05 np0005538960 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Nov 28 10:15:05 np0005538960 systemd[1]: Reached target Socket Units.
Nov 28 10:15:05 np0005538960 systemd[1]: Starting D-Bus System Message Bus...
Nov 28 10:15:05 np0005538960 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 10:15:05 np0005538960 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Load Kernel Module configfs...
Nov 28 10:15:05 np0005538960 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Nov 28 10:15:05 np0005538960 systemd[1]: Finished Load Kernel Module configfs.
Nov 28 10:15:05 np0005538960 systemd-udevd[738]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:15:05 np0005538960 systemd[1]: Started D-Bus System Message Bus.
Nov 28 10:15:05 np0005538960 systemd[1]: Reached target Basic System.
Nov 28 10:15:05 np0005538960 dbus-broker-lau[755]: Ready
Nov 28 10:15:05 np0005538960 systemd[1]: Starting NTP client/server...
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Nov 28 10:15:05 np0005538960 systemd[1]: Starting Restore /run/initramfs on shutdown...
Nov 28 10:15:05 np0005538960 systemd[1]: Starting IPv4 firewall with iptables...
Nov 28 10:15:05 np0005538960 systemd[1]: Started irqbalance daemon.
Nov 28 10:15:05 np0005538960 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Nov 28 10:15:05 np0005538960 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 10:15:05 np0005538960 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 10:15:05 np0005538960 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 10:15:05 np0005538960 systemd[1]: Reached target sshd-keygen.target.
Nov 28 10:15:05 np0005538960 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Nov 28 10:15:05 np0005538960 systemd[1]: Reached target User and Group Name Lookups.
Nov 28 10:15:05 np0005538960 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Nov 28 10:15:05 np0005538960 systemd[1]: Starting User Login Management...
Nov 28 10:15:06 np0005538960 systemd[1]: Finished Restore /run/initramfs on shutdown.
Nov 28 10:15:06 np0005538960 chronyd[797]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 10:15:06 np0005538960 chronyd[797]: Loaded 0 symmetric keys
Nov 28 10:15:06 np0005538960 chronyd[797]: Using right/UTC timezone to obtain leap second data
Nov 28 10:15:06 np0005538960 chronyd[797]: Loaded seccomp filter (level 2)
Nov 28 10:15:06 np0005538960 systemd[1]: Started NTP client/server.
Nov 28 10:15:06 np0005538960 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 10:15:06 np0005538960 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 10:15:06 np0005538960 systemd-logind[788]: New seat seat0.
Nov 28 10:15:06 np0005538960 systemd[1]: Started User Login Management.
Nov 28 10:15:06 np0005538960 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Nov 28 10:15:06 np0005538960 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Nov 28 10:15:06 np0005538960 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Nov 28 10:15:06 np0005538960 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Nov 28 10:15:06 np0005538960 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Nov 28 10:15:06 np0005538960 kernel: kvm_amd: TSC scaling supported
Nov 28 10:15:06 np0005538960 kernel: kvm_amd: Nested Virtualization enabled
Nov 28 10:15:06 np0005538960 kernel: kvm_amd: Nested Paging enabled
Nov 28 10:15:06 np0005538960 kernel: kvm_amd: LBR virtualization supported
Nov 28 10:15:06 np0005538960 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Nov 28 10:15:06 np0005538960 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Nov 28 10:15:06 np0005538960 kernel: Console: switching to colour dummy device 80x25
Nov 28 10:15:06 np0005538960 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Nov 28 10:15:06 np0005538960 kernel: [drm] features: -context_init
Nov 28 10:15:06 np0005538960 kernel: [drm] number of scanouts: 1
Nov 28 10:15:06 np0005538960 kernel: [drm] number of cap sets: 0
Nov 28 10:15:06 np0005538960 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Nov 28 10:15:06 np0005538960 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Nov 28 10:15:06 np0005538960 kernel: Console: switching to colour frame buffer device 128x48
Nov 28 10:15:06 np0005538960 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Nov 28 10:15:06 np0005538960 iptables.init[781]: iptables: Applying firewall rules: [  OK  ]
Nov 28 10:15:06 np0005538960 systemd[1]: Finished IPv4 firewall with iptables.
Nov 28 10:15:06 np0005538960 cloud-init[842]: Cloud-init v. 24.4-7.el9 running 'init-local' at Fri, 28 Nov 2025 15:15:06 +0000. Up 6.05 seconds.
Nov 28 10:15:06 np0005538960 systemd[1]: run-cloud\x2dinit-tmp-tmpz8v3z6om.mount: Deactivated successfully.
Nov 28 10:15:06 np0005538960 systemd[1]: Starting Hostname Service...
Nov 28 10:15:06 np0005538960 systemd[1]: Started Hostname Service.
Nov 28 10:15:06 np0005538960 systemd-hostnamed[856]: Hostname set to <np0005538960.novalocal> (static)
Nov 28 10:15:06 np0005538960 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Nov 28 10:15:06 np0005538960 systemd[1]: Reached target Preparation for Network.
Nov 28 10:15:06 np0005538960 systemd[1]: Starting Network Manager...
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9364] NetworkManager (version 1.54.1-1.el9) is starting... (boot:173012fb-6f90-4e2d-8a88-62d31adeba03)
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9369] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9468] manager[0x55580b32c080]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9507] hostname: hostname: using hostnamed
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9507] hostname: static hostname changed from (none) to "np0005538960.novalocal"
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9512] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9636] manager[0x55580b32c080]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9637] manager[0x55580b32c080]: rfkill: WWAN hardware radio set enabled
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9685] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9686] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9687] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 10:15:06 np0005538960 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9688] manager: Networking is enabled by state file
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9693] settings: Loaded settings plugin: keyfile (internal)
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9707] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9736] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9751] dhcp: init: Using DHCP client 'internal'
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9754] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9769] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9777] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9786] device (lo): Activation: starting connection 'lo' (4b798dc0-9e3b-4d95-858d-136ef1166398)
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9800] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9804] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9846] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9852] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9855] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9858] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9861] device (eth0): carrier: link connected
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9866] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 28 10:15:06 np0005538960 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9873] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9887] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9892] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9893] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9895] manager: NetworkManager state is now CONNECTING
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9896] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:15:06 np0005538960 systemd[1]: Started Network Manager.
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9904] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:15:06 np0005538960 NetworkManager[860]: <info>  [1764342906.9907] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 10:15:06 np0005538960 systemd[1]: Reached target Network.
Nov 28 10:15:06 np0005538960 systemd[1]: Starting Network Manager Wait Online...
Nov 28 10:15:07 np0005538960 systemd[1]: Starting GSSAPI Proxy Daemon...
Nov 28 10:15:07 np0005538960 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.0114] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.0120] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.0130] device (lo): Activation: successful, device activated.
Nov 28 10:15:07 np0005538960 systemd[1]: Started GSSAPI Proxy Daemon.
Nov 28 10:15:07 np0005538960 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Nov 28 10:15:07 np0005538960 systemd[1]: Reached target NFS client services.
Nov 28 10:15:07 np0005538960 systemd[1]: Reached target Preparation for Remote File Systems.
Nov 28 10:15:07 np0005538960 systemd[1]: Reached target Remote File Systems.
Nov 28 10:15:07 np0005538960 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.6718] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.6733] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.6760] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.6813] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.6815] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.6821] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.6824] device (eth0): Activation: successful, device activated.
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.6830] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 10:15:07 np0005538960 NetworkManager[860]: <info>  [1764342907.6832] manager: startup complete
Nov 28 10:15:07 np0005538960 systemd[1]: Finished Network Manager Wait Online.
Nov 28 10:15:07 np0005538960 systemd[1]: Starting Cloud-init: Network Stage...
Nov 28 10:15:08 np0005538960 cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Fri, 28 Nov 2025 15:15:07 +0000. Up 7.63 seconds.
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: |  eth0  | True |        38.102.83.204         | 255.255.255.0 | global | fa:16:3e:fe:75:85 |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fefe:7585/64 |       .       |  link  | fa:16:3e:fe:75:85 |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Nov 28 10:15:08 np0005538960 cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Nov 28 10:15:09 np0005538960 cloud-init[923]: Generating public/private rsa key pair.
Nov 28 10:15:09 np0005538960 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Nov 28 10:15:09 np0005538960 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Nov 28 10:15:09 np0005538960 cloud-init[923]: The key fingerprint is:
Nov 28 10:15:09 np0005538960 cloud-init[923]: SHA256:CT8nmVtSIGKDOXvfYHHbSV/D4hPVT3119fmsXsA0Niw root@np0005538960.novalocal
Nov 28 10:15:09 np0005538960 cloud-init[923]: The key's randomart image is:
Nov 28 10:15:09 np0005538960 cloud-init[923]: +---[RSA 3072]----+
Nov 28 10:15:09 np0005538960 cloud-init[923]: |   o+ . .    o..B|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |  +. o..... o.+ O|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |   o  .o +.+E+*++|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |  . . oo.=o += +o|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |   . o oS o  .o o|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |      . .B     o |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |        .     . .|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |             . . |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |              .  |
Nov 28 10:15:09 np0005538960 cloud-init[923]: +----[SHA256]-----+
Nov 28 10:15:09 np0005538960 cloud-init[923]: Generating public/private ecdsa key pair.
Nov 28 10:15:09 np0005538960 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Nov 28 10:15:09 np0005538960 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Nov 28 10:15:09 np0005538960 cloud-init[923]: The key fingerprint is:
Nov 28 10:15:09 np0005538960 cloud-init[923]: SHA256:nBuA0etdQOhBLn+yoZRYk+EJOtg1vIPTYMoVegmip3Y root@np0005538960.novalocal
Nov 28 10:15:09 np0005538960 cloud-init[923]: The key's randomart image is:
Nov 28 10:15:09 np0005538960 cloud-init[923]: +---[ECDSA 256]---+
Nov 28 10:15:09 np0005538960 cloud-init[923]: |...o*o.o.        |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |+o==oO+ .        |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |*++*Oooo .       |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |.=+o+=oo ..      |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |...Eoo+.S.       |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |. .. ..=.o       |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |    . . .        |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |                 |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |                 |
Nov 28 10:15:09 np0005538960 cloud-init[923]: +----[SHA256]-----+
Nov 28 10:15:09 np0005538960 cloud-init[923]: Generating public/private ed25519 key pair.
Nov 28 10:15:09 np0005538960 cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Nov 28 10:15:09 np0005538960 cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Nov 28 10:15:09 np0005538960 cloud-init[923]: The key fingerprint is:
Nov 28 10:15:09 np0005538960 cloud-init[923]: SHA256:bZxsT6EkgIVl9qyGbUcrV6w+P0Nc02FzNvXzNVudpNM root@np0005538960.novalocal
Nov 28 10:15:09 np0005538960 cloud-init[923]: The key's randomart image is:
Nov 28 10:15:09 np0005538960 cloud-init[923]: +--[ED25519 256]--+
Nov 28 10:15:09 np0005538960 cloud-init[923]: |     =*        .o|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |    oo + .    B.*|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |        = + .= E*|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |     o o X oo.o B|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |    . * S.O... ..|
Nov 28 10:15:09 np0005538960 cloud-init[923]: |     o = ooo     |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |        o.  .    |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |         oo      |
Nov 28 10:15:09 np0005538960 cloud-init[923]: |          .o     |
Nov 28 10:15:09 np0005538960 cloud-init[923]: +----[SHA256]-----+
Nov 28 10:15:09 np0005538960 systemd[1]: Finished Cloud-init: Network Stage.
Nov 28 10:15:09 np0005538960 systemd[1]: Reached target Cloud-config availability.
Nov 28 10:15:09 np0005538960 systemd[1]: Reached target Network is Online.
Nov 28 10:15:09 np0005538960 systemd[1]: Starting Cloud-init: Config Stage...
Nov 28 10:15:09 np0005538960 systemd[1]: Starting Crash recovery kernel arming...
Nov 28 10:15:09 np0005538960 systemd[1]: Starting Notify NFS peers of a restart...
Nov 28 10:15:09 np0005538960 systemd[1]: Starting System Logging Service...
Nov 28 10:15:09 np0005538960 systemd[1]: Starting OpenSSH server daemon...
Nov 28 10:15:09 np0005538960 systemd[1]: Starting Permit User Sessions...
Nov 28 10:15:09 np0005538960 sm-notify[1007]: Version 2.5.4 starting
Nov 28 10:15:09 np0005538960 systemd[1]: Started Notify NFS peers of a restart.
Nov 28 10:15:09 np0005538960 systemd[1]: Started OpenSSH server daemon.
Nov 28 10:15:09 np0005538960 systemd[1]: Finished Permit User Sessions.
Nov 28 10:15:09 np0005538960 systemd[1]: Started Command Scheduler.
Nov 28 10:15:09 np0005538960 systemd[1]: Started Getty on tty1.
Nov 28 10:15:09 np0005538960 systemd[1]: Started Serial Getty on ttyS0.
Nov 28 10:15:09 np0005538960 systemd[1]: Reached target Login Prompts.
Nov 28 10:15:09 np0005538960 systemd[1]: Started System Logging Service.
Nov 28 10:15:09 np0005538960 rsyslogd[1008]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1008" x-info="https://www.rsyslog.com"] start
Nov 28 10:15:09 np0005538960 rsyslogd[1008]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Nov 28 10:15:09 np0005538960 systemd[1]: Reached target Multi-User System.
Nov 28 10:15:09 np0005538960 systemd[1]: Starting Record Runlevel Change in UTMP...
Nov 28 10:15:09 np0005538960 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Nov 28 10:15:09 np0005538960 systemd[1]: Finished Record Runlevel Change in UTMP.
Nov 28 10:15:09 np0005538960 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 10:15:09 np0005538960 kdumpctl[1017]: kdump: No kdump initial ramdisk found.
Nov 28 10:15:09 np0005538960 kdumpctl[1017]: kdump: Rebuilding /boot/initramfs-5.14.0-642.el9.x86_64kdump.img
Nov 28 10:15:09 np0005538960 cloud-init[1140]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Fri, 28 Nov 2025 15:15:09 +0000. Up 9.27 seconds.
Nov 28 10:15:09 np0005538960 systemd[1]: Finished Cloud-init: Config Stage.
Nov 28 10:15:09 np0005538960 systemd[1]: Starting Cloud-init: Final Stage...
Nov 28 10:15:09 np0005538960 dracut[1268]: dracut-057-102.git20250818.el9
Nov 28 10:15:10 np0005538960 cloud-init[1290]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Fri, 28 Nov 2025 15:15:10 +0000. Up 9.71 seconds.
Nov 28 10:15:10 np0005538960 dracut[1270]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/b277050f-8ace-464d-abb6-4c46d4c45253 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-642.el9.x86_64kdump.img 5.14.0-642.el9.x86_64
Nov 28 10:15:10 np0005538960 cloud-init[1312]: #############################################################
Nov 28 10:15:10 np0005538960 cloud-init[1316]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Nov 28 10:15:10 np0005538960 cloud-init[1323]: 256 SHA256:nBuA0etdQOhBLn+yoZRYk+EJOtg1vIPTYMoVegmip3Y root@np0005538960.novalocal (ECDSA)
Nov 28 10:15:10 np0005538960 cloud-init[1330]: 256 SHA256:bZxsT6EkgIVl9qyGbUcrV6w+P0Nc02FzNvXzNVudpNM root@np0005538960.novalocal (ED25519)
Nov 28 10:15:10 np0005538960 cloud-init[1338]: 3072 SHA256:CT8nmVtSIGKDOXvfYHHbSV/D4hPVT3119fmsXsA0Niw root@np0005538960.novalocal (RSA)
Nov 28 10:15:10 np0005538960 cloud-init[1341]: -----END SSH HOST KEY FINGERPRINTS-----
Nov 28 10:15:10 np0005538960 cloud-init[1343]: #############################################################
Nov 28 10:15:10 np0005538960 cloud-init[1290]: Cloud-init v. 24.4-7.el9 finished at Fri, 28 Nov 2025 15:15:10 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.88 seconds
Nov 28 10:15:10 np0005538960 systemd[1]: Finished Cloud-init: Final Stage.
Nov 28 10:15:10 np0005538960 systemd[1]: Reached target Cloud-init target.
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 28 10:15:10 np0005538960 dracut[1270]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: memstrack is not available
Nov 28 10:15:11 np0005538960 dracut[1270]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Nov 28 10:15:11 np0005538960 dracut[1270]: memstrack is not available
Nov 28 10:15:11 np0005538960 dracut[1270]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Nov 28 10:15:11 np0005538960 dracut[1270]: *** Including module: systemd ***
Nov 28 10:15:12 np0005538960 dracut[1270]: *** Including module: fips ***
Nov 28 10:15:12 np0005538960 chronyd[797]: Selected source 174.138.193.90 (2.centos.pool.ntp.org)
Nov 28 10:15:12 np0005538960 chronyd[797]: System clock TAI offset set to 37 seconds
Nov 28 10:15:12 np0005538960 dracut[1270]: *** Including module: systemd-initrd ***
Nov 28 10:15:12 np0005538960 dracut[1270]: *** Including module: i18n ***
Nov 28 10:15:12 np0005538960 dracut[1270]: *** Including module: drm ***
Nov 28 10:15:13 np0005538960 dracut[1270]: *** Including module: prefixdevname ***
Nov 28 10:15:13 np0005538960 dracut[1270]: *** Including module: kernel-modules ***
Nov 28 10:15:13 np0005538960 kernel: block vda: the capability attribute has been deprecated.
Nov 28 10:15:14 np0005538960 dracut[1270]: *** Including module: kernel-modules-extra ***
Nov 28 10:15:14 np0005538960 dracut[1270]: *** Including module: qemu ***
Nov 28 10:15:14 np0005538960 dracut[1270]: *** Including module: fstab-sys ***
Nov 28 10:15:14 np0005538960 dracut[1270]: *** Including module: rootfs-block ***
Nov 28 10:15:14 np0005538960 dracut[1270]: *** Including module: terminfo ***
Nov 28 10:15:14 np0005538960 dracut[1270]: *** Including module: udev-rules ***
Nov 28 10:15:15 np0005538960 dracut[1270]: Skipping udev rule: 91-permissions.rules
Nov 28 10:15:15 np0005538960 dracut[1270]: Skipping udev rule: 80-drivers-modprobe.rules
Nov 28 10:15:15 np0005538960 dracut[1270]: *** Including module: virtiofs ***
Nov 28 10:15:15 np0005538960 dracut[1270]: *** Including module: dracut-systemd ***
Nov 28 10:15:15 np0005538960 dracut[1270]: *** Including module: usrmount ***
Nov 28 10:15:15 np0005538960 dracut[1270]: *** Including module: base ***
Nov 28 10:15:15 np0005538960 dracut[1270]: *** Including module: fs-lib ***
Nov 28 10:15:15 np0005538960 dracut[1270]: *** Including module: kdumpbase ***
Nov 28 10:15:15 np0005538960 dracut[1270]: *** Including module: microcode_ctl-fw_dir_override ***
Nov 28 10:15:15 np0005538960 dracut[1270]:  microcode_ctl module: mangling fw_dir
Nov 28 10:15:15 np0005538960 dracut[1270]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Nov 28 10:15:15 np0005538960 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: configuration "intel" is ignored
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Nov 28 10:15:16 np0005538960 irqbalance[782]: Cannot change IRQ 25 affinity: Operation not permitted
Nov 28 10:15:16 np0005538960 irqbalance[782]: IRQ 25 affinity is now unmanaged
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Nov 28 10:15:16 np0005538960 irqbalance[782]: Cannot change IRQ 31 affinity: Operation not permitted
Nov 28 10:15:16 np0005538960 irqbalance[782]: IRQ 31 affinity is now unmanaged
Nov 28 10:15:16 np0005538960 irqbalance[782]: Cannot change IRQ 28 affinity: Operation not permitted
Nov 28 10:15:16 np0005538960 irqbalance[782]: IRQ 28 affinity is now unmanaged
Nov 28 10:15:16 np0005538960 irqbalance[782]: Cannot change IRQ 32 affinity: Operation not permitted
Nov 28 10:15:16 np0005538960 irqbalance[782]: IRQ 32 affinity is now unmanaged
Nov 28 10:15:16 np0005538960 irqbalance[782]: Cannot change IRQ 30 affinity: Operation not permitted
Nov 28 10:15:16 np0005538960 irqbalance[782]: IRQ 30 affinity is now unmanaged
Nov 28 10:15:16 np0005538960 irqbalance[782]: Cannot change IRQ 29 affinity: Operation not permitted
Nov 28 10:15:16 np0005538960 irqbalance[782]: IRQ 29 affinity is now unmanaged
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Nov 28 10:15:16 np0005538960 dracut[1270]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Nov 28 10:15:16 np0005538960 dracut[1270]: *** Including module: openssl ***
Nov 28 10:15:16 np0005538960 dracut[1270]: *** Including module: shutdown ***
Nov 28 10:15:16 np0005538960 dracut[1270]: *** Including module: squash ***
Nov 28 10:15:16 np0005538960 dracut[1270]: *** Including modules done ***
Nov 28 10:15:16 np0005538960 dracut[1270]: *** Installing kernel module dependencies ***
Nov 28 10:15:17 np0005538960 dracut[1270]: *** Installing kernel module dependencies done ***
Nov 28 10:15:17 np0005538960 dracut[1270]: *** Resolving executable dependencies ***
Nov 28 10:15:17 np0005538960 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 10:15:18 np0005538960 dracut[1270]: *** Resolving executable dependencies done ***
Nov 28 10:15:18 np0005538960 dracut[1270]: *** Generating early-microcode cpio image ***
Nov 28 10:15:18 np0005538960 dracut[1270]: *** Store current command line parameters ***
Nov 28 10:15:18 np0005538960 dracut[1270]: Stored kernel commandline:
Nov 28 10:15:18 np0005538960 dracut[1270]: No dracut internal kernel commandline stored in the initramfs
Nov 28 10:15:19 np0005538960 dracut[1270]: *** Install squash loader ***
Nov 28 10:15:19 np0005538960 dracut[1270]: *** Squashing the files inside the initramfs ***
Nov 28 10:15:20 np0005538960 dracut[1270]: *** Squashing the files inside the initramfs done ***
Nov 28 10:15:20 np0005538960 dracut[1270]: *** Creating image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' ***
Nov 28 10:15:20 np0005538960 dracut[1270]: *** Hardlinking files ***
Nov 28 10:15:20 np0005538960 dracut[1270]: *** Hardlinking files done ***
Nov 28 10:15:21 np0005538960 dracut[1270]: *** Creating initramfs image file '/boot/initramfs-5.14.0-642.el9.x86_64kdump.img' done ***
Nov 28 10:15:21 np0005538960 kdumpctl[1017]: kdump: kexec: loaded kdump kernel
Nov 28 10:15:21 np0005538960 kdumpctl[1017]: kdump: Starting kdump: [OK]
Nov 28 10:15:21 np0005538960 systemd[1]: Finished Crash recovery kernel arming.
Nov 28 10:15:21 np0005538960 systemd[1]: Startup finished in 1.528s (kernel) + 2.632s (initrd) + 17.271s (userspace) = 21.433s.
Nov 28 10:15:36 np0005538960 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 10:19:47 np0005538960 systemd[1]: Created slice User Slice of UID 1000.
Nov 28 10:19:47 np0005538960 systemd[1]: Starting User Runtime Directory /run/user/1000...
Nov 28 10:19:47 np0005538960 systemd-logind[788]: New session 1 of user zuul.
Nov 28 10:19:47 np0005538960 systemd[1]: Finished User Runtime Directory /run/user/1000.
Nov 28 10:19:47 np0005538960 systemd[1]: Starting User Manager for UID 1000...
Nov 28 10:19:47 np0005538960 systemd[4323]: Queued start job for default target Main User Target.
Nov 28 10:19:47 np0005538960 systemd[4323]: Created slice User Application Slice.
Nov 28 10:19:47 np0005538960 systemd[4323]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 10:19:47 np0005538960 systemd[4323]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 10:19:47 np0005538960 systemd[4323]: Reached target Paths.
Nov 28 10:19:47 np0005538960 systemd[4323]: Reached target Timers.
Nov 28 10:19:47 np0005538960 systemd[4323]: Starting D-Bus User Message Bus Socket...
Nov 28 10:19:47 np0005538960 systemd[4323]: Starting Create User's Volatile Files and Directories...
Nov 28 10:19:47 np0005538960 systemd[4323]: Finished Create User's Volatile Files and Directories.
Nov 28 10:19:47 np0005538960 systemd[4323]: Listening on D-Bus User Message Bus Socket.
Nov 28 10:19:47 np0005538960 systemd[4323]: Reached target Sockets.
Nov 28 10:19:47 np0005538960 systemd[4323]: Reached target Basic System.
Nov 28 10:19:47 np0005538960 systemd[4323]: Reached target Main User Target.
Nov 28 10:19:47 np0005538960 systemd[4323]: Startup finished in 115ms.
Nov 28 10:19:47 np0005538960 systemd[1]: Started User Manager for UID 1000.
Nov 28 10:19:47 np0005538960 systemd[1]: Started Session 1 of User zuul.
Nov 28 10:19:48 np0005538960 python3[4405]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:19:51 np0005538960 python3[4433]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:19:58 np0005538960 python3[4491]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:19:59 np0005538960 python3[4531]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Nov 28 10:20:02 np0005538960 python3[4557]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCam19m0ksqTRDTC1k8X1Qq09nXp1quPdWEBRLDymaPg+OoNBplvoS0slXgHnynuAsRjUnvt1wNNOUyqWeRtVDDEO6f4X+UJhhyTMi5pe1scPIWzNIBU01sYPabP2lruL4HsjFJPCXkXyIIHAJVGLt1uCCRLshGSenp0eSugBGimOnccDYkswksSDtJVWZ0YI3G8ETTMpcFm9t9yV014VZq9HhU5ruyUbcPiWJa7uoBVWSHDWNKBknTRTNp/+iqeuxlnkKesATN3ZqnOqUFTjBatiKml0S+1jVwS6ahVUzMskMVwu6OpRXBOjWxI/EbjRyiltjoygna+kmgsR7+IhtxKcsNvwGh+8ByXOf0FK+UOKhnecPct+eMQNm4qjm5956VT18pQi+MDbg1lwHsS9iy88SA65jM9h3XSvQQkM5B3G6tM/ycSUBdG8huAWQaRrrma3L0RK0FDThfuJLk9AylFfFGvf8nyn+/HTPQMPpApUZdO/9btaIQYvgNQL+v/oc= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:02 np0005538960 python3[4581]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:03 np0005538960 python3[4680]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:20:03 np0005538960 python3[4751]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764343202.7911186-252-231660499641177/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=6bb528e740df473d8906787853b35f46_id_rsa follow=False checksum=639acf7a65ba71b7629a08e0ffec5801e0978ed5 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:04 np0005538960 python3[4874]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:20:04 np0005538960 python3[4945]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764343203.7840197-307-94927954158549/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=6bb528e740df473d8906787853b35f46_id_rsa.pub follow=False checksum=847027b3483a69cb2ce3830044674fbb959797e8 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:05 np0005538960 python3[4993]: ansible-ping Invoked with data=pong
Nov 28 10:20:06 np0005538960 python3[5017]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:20:10 np0005538960 python3[5075]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Nov 28 10:20:11 np0005538960 python3[5107]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:11 np0005538960 python3[5131]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:11 np0005538960 python3[5155]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:12 np0005538960 python3[5179]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:12 np0005538960 python3[5203]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:12 np0005538960 python3[5227]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:14 np0005538960 python3[5253]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:16 np0005538960 python3[5331]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:20:16 np0005538960 python3[5404]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764343215.7843688-32-90389578083108/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:17 np0005538960 python3[5452]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:17 np0005538960 python3[5476]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:17 np0005538960 python3[5500]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:18 np0005538960 python3[5524]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:18 np0005538960 python3[5548]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:19 np0005538960 python3[5572]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:19 np0005538960 python3[5596]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:19 np0005538960 python3[5620]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:20 np0005538960 python3[5644]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:20 np0005538960 python3[5668]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:20 np0005538960 python3[5692]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:20 np0005538960 python3[5716]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:21 np0005538960 python3[5740]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:21 np0005538960 python3[5764]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:21 np0005538960 python3[5788]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:22 np0005538960 python3[5812]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:23 np0005538960 python3[5836]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:23 np0005538960 python3[5860]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:23 np0005538960 python3[5884]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:24 np0005538960 python3[5908]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:24 np0005538960 python3[5932]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:24 np0005538960 python3[5956]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:24 np0005538960 python3[5980]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:25 np0005538960 python3[6004]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:25 np0005538960 python3[6028]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:25 np0005538960 python3[6052]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:20:27 np0005538960 python3[6078]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 10:20:27 np0005538960 systemd[1]: Starting Time & Date Service...
Nov 28 10:20:27 np0005538960 systemd[1]: Started Time & Date Service.
Nov 28 10:20:27 np0005538960 systemd-timedated[6080]: Changed time zone to 'UTC' (UTC).
Nov 28 10:20:28 np0005538960 python3[6109]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:28 np0005538960 python3[6187]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:20:28 np0005538960 python3[6258]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1764343228.3861911-252-185458213135526/source _original_basename=tmpbdpu7jjv follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:29 np0005538960 python3[6358]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:20:29 np0005538960 python3[6429]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764343229.3498566-303-154650492796355/source _original_basename=tmp9tk1iy5e follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:30 np0005538960 python3[6531]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:20:31 np0005538960 python3[6604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1764343230.6140478-382-54281203780472/source _original_basename=tmprfw52oke follow=False checksum=01954034105cdb65b42722894a5c1036808c70c7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:31 np0005538960 python3[6652]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:20:32 np0005538960 python3[6678]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:20:32 np0005538960 python3[6758]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:20:33 np0005538960 python3[6831]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1764343232.4749079-453-115823477153394/source _original_basename=tmpqsfjp0nq follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:34 np0005538960 python3[6882]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-d88a-0f2a-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:20:34 np0005538960 python3[6910]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-d88a-0f2a-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Nov 28 10:20:36 np0005538960 python3[6938]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:54 np0005538960 python3[6964]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:20:57 np0005538960 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 10:21:54 np0005538960 systemd-logind[788]: Session 1 logged out. Waiting for processes to exit.
Nov 28 10:21:57 np0005538960 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Nov 28 10:21:57 np0005538960 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Nov 28 10:21:57 np0005538960 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Nov 28 10:21:57 np0005538960 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Nov 28 10:21:57 np0005538960 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Nov 28 10:21:57 np0005538960 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Nov 28 10:21:57 np0005538960 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Nov 28 10:21:57 np0005538960 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Nov 28 10:21:57 np0005538960 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Nov 28 10:21:57 np0005538960 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3705] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 10:21:57 np0005538960 systemd-udevd[6970]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3914] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3943] settings: (eth1): created default wired connection 'Wired connection 1'
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3948] device (eth1): carrier: link connected
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3951] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3957] policy: auto-activating connection 'Wired connection 1' (526edb83-c117-3f66-855f-1f77ff3e3e99)
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3961] device (eth1): Activation: starting connection 'Wired connection 1' (526edb83-c117-3f66-855f-1f77ff3e3e99)
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3962] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3966] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3971] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:21:57 np0005538960 NetworkManager[860]: <info>  [1764343317.3976] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 10:21:57 np0005538960 systemd[4323]: Starting Mark boot as successful...
Nov 28 10:21:57 np0005538960 systemd[4323]: Finished Mark boot as successful.
Nov 28 10:21:58 np0005538960 systemd-logind[788]: New session 3 of user zuul.
Nov 28 10:21:58 np0005538960 systemd[1]: Started Session 3 of User zuul.
Nov 28 10:21:58 np0005538960 python3[7001]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-39ef-da6f-000000000189-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:22:05 np0005538960 python3[7082]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:22:05 np0005538960 python3[7155]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764343324.9917808-155-152912965358663/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=dd97b97ca1dc65128414b0667b8cdd4c2debab35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:22:06 np0005538960 python3[7205]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 10:22:06 np0005538960 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 28 10:22:06 np0005538960 systemd[1]: Stopped Network Manager Wait Online.
Nov 28 10:22:06 np0005538960 systemd[1]: Stopping Network Manager Wait Online...
Nov 28 10:22:06 np0005538960 systemd[1]: Stopping Network Manager...
Nov 28 10:22:06 np0005538960 NetworkManager[860]: <info>  [1764343326.3046] caught SIGTERM, shutting down normally.
Nov 28 10:22:06 np0005538960 NetworkManager[860]: <info>  [1764343326.3058] dhcp4 (eth0): canceled DHCP transaction
Nov 28 10:22:06 np0005538960 NetworkManager[860]: <info>  [1764343326.3058] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 10:22:06 np0005538960 NetworkManager[860]: <info>  [1764343326.3058] dhcp4 (eth0): state changed no lease
Nov 28 10:22:06 np0005538960 NetworkManager[860]: <info>  [1764343326.3060] manager: NetworkManager state is now CONNECTING
Nov 28 10:22:06 np0005538960 NetworkManager[860]: <info>  [1764343326.3139] dhcp4 (eth1): canceled DHCP transaction
Nov 28 10:22:06 np0005538960 NetworkManager[860]: <info>  [1764343326.3140] dhcp4 (eth1): state changed no lease
Nov 28 10:22:06 np0005538960 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 10:22:06 np0005538960 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 10:22:06 np0005538960 NetworkManager[860]: <info>  [1764343326.4141] exiting (success)
Nov 28 10:22:06 np0005538960 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 28 10:22:06 np0005538960 systemd[1]: Stopped Network Manager.
Nov 28 10:22:06 np0005538960 systemd[1]: NetworkManager.service: Consumed 3.167s CPU time, 10.0M memory peak.
Nov 28 10:22:06 np0005538960 systemd[1]: Starting Network Manager...
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.4716] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:173012fb-6f90-4e2d-8a88-62d31adeba03)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.4717] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.4765] manager[0x55bef017c070]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 10:22:06 np0005538960 systemd[1]: Starting Hostname Service...
Nov 28 10:22:06 np0005538960 systemd[1]: Started Hostname Service.
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5510] hostname: hostname: using hostnamed
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5511] hostname: static hostname changed from (none) to "np0005538960.novalocal"
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5517] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5522] manager[0x55bef017c070]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5522] manager[0x55bef017c070]: rfkill: WWAN hardware radio set enabled
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5550] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5550] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5551] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5551] manager: Networking is enabled by state file
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5553] settings: Loaded settings plugin: keyfile (internal)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5556] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5581] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5590] dhcp: init: Using DHCP client 'internal'
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5592] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5597] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5602] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5609] device (lo): Activation: starting connection 'lo' (4b798dc0-9e3b-4d95-858d-136ef1166398)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5615] device (eth0): carrier: link connected
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5618] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5622] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5622] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5627] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5631] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5635] device (eth1): carrier: link connected
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5638] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5642] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (526edb83-c117-3f66-855f-1f77ff3e3e99) (indicated)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5642] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5646] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5651] device (eth1): Activation: starting connection 'Wired connection 1' (526edb83-c117-3f66-855f-1f77ff3e3e99)
Nov 28 10:22:06 np0005538960 systemd[1]: Started Network Manager.
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5656] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5659] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5660] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5662] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5664] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5667] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5669] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5671] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5672] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5676] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5678] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5685] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5686] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5701] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5703] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5705] device (lo): Activation: successful, device activated.
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5723] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.5727] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 10:22:06 np0005538960 systemd[1]: Starting Network Manager Wait Online...
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.6175] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.6203] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.6205] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.6212] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.6217] device (eth0): Activation: successful, device activated.
Nov 28 10:22:06 np0005538960 NetworkManager[7222]: <info>  [1764343326.6223] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 10:22:06 np0005538960 python3[7289]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-39ef-da6f-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:22:16 np0005538960 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 10:22:36 np0005538960 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.3666] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 10:22:51 np0005538960 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 10:22:51 np0005538960 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.3938] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.3940] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.3947] device (eth1): Activation: successful, device activated.
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.3953] manager: startup complete
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.3954] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <warn>  [1764343371.3958] device (eth1): Activation: failed for connection 'Wired connection 1'
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.3965] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Nov 28 10:22:51 np0005538960 systemd[1]: Finished Network Manager Wait Online.
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4089] dhcp4 (eth1): canceled DHCP transaction
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4090] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4090] dhcp4 (eth1): state changed no lease
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4106] policy: auto-activating connection 'ci-private-network' (b8ecb49a-bdd7-562b-970f-30977a791a02)
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4111] device (eth1): Activation: starting connection 'ci-private-network' (b8ecb49a-bdd7-562b-970f-30977a791a02)
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4112] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4117] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4125] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4133] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4181] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4184] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:22:51 np0005538960 NetworkManager[7222]: <info>  [1764343371.4194] device (eth1): Activation: successful, device activated.
Nov 28 10:23:01 np0005538960 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 10:23:06 np0005538960 systemd[1]: session-3.scope: Deactivated successfully.
Nov 28 10:23:06 np0005538960 systemd[1]: session-3.scope: Consumed 1.631s CPU time.
Nov 28 10:23:06 np0005538960 systemd-logind[788]: Session 3 logged out. Waiting for processes to exit.
Nov 28 10:23:06 np0005538960 systemd-logind[788]: Removed session 3.
Nov 28 10:23:34 np0005538960 systemd-logind[788]: New session 4 of user zuul.
Nov 28 10:23:34 np0005538960 systemd[1]: Started Session 4 of User zuul.
Nov 28 10:23:34 np0005538960 python3[7401]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:23:34 np0005538960 python3[7474]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764343414.278208-365-44497423092648/source _original_basename=tmpd14eymyo follow=False checksum=54e5374e15288aeb31d5a04b016984fdaf646538 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:23:38 np0005538960 systemd[1]: session-4.scope: Deactivated successfully.
Nov 28 10:23:38 np0005538960 systemd-logind[788]: Session 4 logged out. Waiting for processes to exit.
Nov 28 10:23:38 np0005538960 systemd-logind[788]: Removed session 4.
Nov 28 10:24:59 np0005538960 systemd[4323]: Created slice User Background Tasks Slice.
Nov 28 10:24:59 np0005538960 systemd[4323]: Starting Cleanup of User's Temporary Files and Directories...
Nov 28 10:24:59 np0005538960 systemd[4323]: Finished Cleanup of User's Temporary Files and Directories.
Nov 28 10:25:49 np0005538960 systemd[1]: Starting dnf makecache...
Nov 28 10:25:49 np0005538960 dnf[7506]: Failed determining last makecache time.
Nov 28 10:25:49 np0005538960 dnf[7506]: CentOS Stream 9 - BaseOS                         49 kB/s | 7.3 kB     00:00
Nov 28 10:25:49 np0005538960 dnf[7506]: CentOS Stream 9 - AppStream                      74 kB/s | 7.4 kB     00:00
Nov 28 10:25:49 np0005538960 dnf[7506]: CentOS Stream 9 - CRB                            73 kB/s | 7.2 kB     00:00
Nov 28 10:25:50 np0005538960 dnf[7506]: CentOS Stream 9 - Extras packages                71 kB/s | 8.3 kB     00:00
Nov 28 10:25:50 np0005538960 dnf[7506]: Metadata cache created.
Nov 28 10:25:50 np0005538960 systemd[1]: dnf-makecache.service: Deactivated successfully.
Nov 28 10:25:50 np0005538960 systemd[1]: Finished dnf makecache.
Nov 28 10:29:37 np0005538960 systemd-logind[788]: New session 5 of user zuul.
Nov 28 10:29:37 np0005538960 systemd[1]: Started Session 5 of User zuul.
Nov 28 10:29:38 np0005538960 python3[7556]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-5d4b-48fd-000000001cd2-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:29:38 np0005538960 python3[7585]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:29:38 np0005538960 python3[7611]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:29:38 np0005538960 python3[7637]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:29:39 np0005538960 python3[7663]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:29:39 np0005538960 python3[7689]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:29:40 np0005538960 python3[7767]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:29:40 np0005538960 python3[7840]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764343780.2063265-507-274325412434026/source _original_basename=tmpe6piednz follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:29:41 np0005538960 python3[7890]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 10:29:41 np0005538960 systemd[1]: Reloading.
Nov 28 10:29:41 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:29:43 np0005538960 python3[7945]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Nov 28 10:29:43 np0005538960 python3[7971]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:29:44 np0005538960 python3[7999]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:29:44 np0005538960 python3[8027]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:29:44 np0005538960 python3[8055]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:29:45 np0005538960 python3[8082]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-5d4b-48fd-000000001cd9-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:29:45 np0005538960 python3[8112]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Nov 28 10:29:48 np0005538960 systemd-logind[788]: Session 5 logged out. Waiting for processes to exit.
Nov 28 10:29:48 np0005538960 systemd[1]: session-5.scope: Deactivated successfully.
Nov 28 10:29:48 np0005538960 systemd[1]: session-5.scope: Consumed 4.137s CPU time.
Nov 28 10:29:48 np0005538960 systemd-logind[788]: Removed session 5.
Nov 28 10:29:50 np0005538960 systemd-logind[788]: New session 6 of user zuul.
Nov 28 10:29:50 np0005538960 systemd[1]: Started Session 6 of User zuul.
Nov 28 10:29:50 np0005538960 python3[8145]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Nov 28 10:30:03 np0005538960 kernel: SELinux:  Converting 385 SID table entries...
Nov 28 10:30:03 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 10:30:03 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 10:30:03 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 10:30:03 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 10:30:03 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 10:30:03 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 10:30:03 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 10:30:09 np0005538960 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=2 res=1
Nov 28 10:30:09 np0005538960 systemd[1]: Starting Cleanup of Temporary Directories...
Nov 28 10:30:09 np0005538960 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Nov 28 10:30:09 np0005538960 systemd[1]: Finished Cleanup of Temporary Directories.
Nov 28 10:30:09 np0005538960 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Nov 28 10:30:12 np0005538960 kernel: SELinux:  Converting 385 SID table entries...
Nov 28 10:30:12 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 10:30:12 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 10:30:12 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 10:30:12 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 10:30:12 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 10:30:12 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 10:30:12 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 10:30:21 np0005538960 kernel: SELinux:  Converting 385 SID table entries...
Nov 28 10:30:21 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 10:30:21 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 10:30:21 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 10:30:21 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 10:30:21 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 10:30:21 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 10:30:21 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 10:30:22 np0005538960 setsebool[8208]: The virt_use_nfs policy boolean was changed to 1 by root
Nov 28 10:30:22 np0005538960 setsebool[8208]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Nov 28 10:30:33 np0005538960 kernel: SELinux:  Converting 388 SID table entries...
Nov 28 10:30:33 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 10:30:33 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 10:30:33 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 10:30:33 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 10:30:33 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 10:30:33 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 10:30:33 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 10:30:51 np0005538960 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 28 10:30:51 np0005538960 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 10:30:51 np0005538960 systemd[1]: Starting man-db-cache-update.service...
Nov 28 10:30:51 np0005538960 systemd[1]: Reloading.
Nov 28 10:30:51 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:30:52 np0005538960 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 10:31:05 np0005538960 python3[17305]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163e3b-3c83-1b0e-8006-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:31:06 np0005538960 kernel: evm: overlay not supported
Nov 28 10:31:06 np0005538960 systemd[4323]: Starting D-Bus User Message Bus...
Nov 28 10:31:06 np0005538960 dbus-broker-launch[17822]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Nov 28 10:31:06 np0005538960 dbus-broker-launch[17822]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Nov 28 10:31:06 np0005538960 systemd[4323]: Started D-Bus User Message Bus.
Nov 28 10:31:06 np0005538960 dbus-broker-lau[17822]: Ready
Nov 28 10:31:06 np0005538960 systemd[4323]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Nov 28 10:31:06 np0005538960 systemd[4323]: Created slice Slice /user.
Nov 28 10:31:06 np0005538960 systemd[4323]: podman-17716.scope: unit configures an IP firewall, but not running as root.
Nov 28 10:31:06 np0005538960 systemd[4323]: (This warning is only shown for the first unit using IP firewalling.)
Nov 28 10:31:06 np0005538960 systemd[4323]: Started podman-17716.scope.
Nov 28 10:31:06 np0005538960 systemd[4323]: Started podman-pause-374c6a61.scope.
Nov 28 10:31:07 np0005538960 python3[18159]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.9:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.9:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:31:07 np0005538960 python3[18159]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Nov 28 10:31:07 np0005538960 systemd-logind[788]: Session 6 logged out. Waiting for processes to exit.
Nov 28 10:31:07 np0005538960 systemd[1]: session-6.scope: Deactivated successfully.
Nov 28 10:31:07 np0005538960 systemd[1]: session-6.scope: Consumed 59.437s CPU time.
Nov 28 10:31:07 np0005538960 systemd-logind[788]: Removed session 6.
Nov 28 10:31:32 np0005538960 systemd-logind[788]: New session 7 of user zuul.
Nov 28 10:31:32 np0005538960 systemd[1]: Started Session 7 of User zuul.
Nov 28 10:31:32 np0005538960 python3[28082]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOx64chRohxneiVaFzjRH8KF4dahaH6Btk3ckclPxHJCS2KA/fsYCopWP0JFW5DeRGQ7SzUBn9rpsrRxcBcLpAc= zuul@np0005538958.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:31:33 np0005538960 python3[28265]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOx64chRohxneiVaFzjRH8KF4dahaH6Btk3ckclPxHJCS2KA/fsYCopWP0JFW5DeRGQ7SzUBn9rpsrRxcBcLpAc= zuul@np0005538958.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:31:34 np0005538960 python3[28614]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005538960.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Nov 28 10:31:34 np0005538960 python3[28884]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOx64chRohxneiVaFzjRH8KF4dahaH6Btk3ckclPxHJCS2KA/fsYCopWP0JFW5DeRGQ7SzUBn9rpsrRxcBcLpAc= zuul@np0005538958.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Nov 28 10:31:35 np0005538960 python3[29198]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:31:35 np0005538960 python3[29388]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764343894.784945-168-170435480370071/source _original_basename=tmp1plsiowp follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:31:36 np0005538960 python3[29640]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Nov 28 10:31:36 np0005538960 systemd[1]: Starting Hostname Service...
Nov 28 10:31:36 np0005538960 systemd[1]: Started Hostname Service.
Nov 28 10:31:36 np0005538960 systemd-hostnamed[29678]: Changed pretty hostname to 'compute-1'
Nov 28 10:31:36 np0005538960 systemd-hostnamed[29678]: Hostname set to <compute-1> (static)
Nov 28 10:31:36 np0005538960 NetworkManager[7222]: <info>  [1764343896.6272] hostname: static hostname changed from "np0005538960.novalocal" to "compute-1"
Nov 28 10:31:36 np0005538960 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 10:31:36 np0005538960 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 10:31:36 np0005538960 systemd-logind[788]: Session 7 logged out. Waiting for processes to exit.
Nov 28 10:31:36 np0005538960 systemd[1]: session-7.scope: Deactivated successfully.
Nov 28 10:31:36 np0005538960 systemd[1]: session-7.scope: Consumed 2.198s CPU time.
Nov 28 10:31:36 np0005538960 systemd-logind[788]: Removed session 7.
Nov 28 10:31:37 np0005538960 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 10:31:37 np0005538960 systemd[1]: Finished man-db-cache-update.service.
Nov 28 10:31:37 np0005538960 systemd[1]: man-db-cache-update.service: Consumed 53.826s CPU time.
Nov 28 10:31:37 np0005538960 systemd[1]: run-r0556f7261fe84a319c8d040b558f1475.service: Deactivated successfully.
Nov 28 10:31:46 np0005538960 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 10:32:06 np0005538960 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 10:35:06 np0005538960 systemd-logind[788]: New session 8 of user zuul.
Nov 28 10:35:06 np0005538960 systemd[1]: Started Session 8 of User zuul.
Nov 28 10:35:07 np0005538960 python3[30103]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:35:08 np0005538960 python3[30219]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:35:09 np0005538960 python3[30292]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764344108.493966-34008-153147276120333/source mode=0755 _original_basename=delorean.repo follow=False checksum=a16f090252000d02a7f7d540bb10f7c1c9cd4ac5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:35:09 np0005538960 python3[30318]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:35:09 np0005538960 python3[30391]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764344108.493966-34008-153147276120333/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:35:10 np0005538960 python3[30417]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:35:10 np0005538960 python3[30490]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764344108.493966-34008-153147276120333/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:35:10 np0005538960 python3[30516]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:35:11 np0005538960 python3[30589]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764344108.493966-34008-153147276120333/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:35:11 np0005538960 python3[30615]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:35:11 np0005538960 python3[30688]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764344108.493966-34008-153147276120333/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:35:11 np0005538960 python3[30714]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:35:12 np0005538960 python3[30787]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764344108.493966-34008-153147276120333/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:35:12 np0005538960 python3[30813]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Nov 28 10:35:12 np0005538960 python3[30886]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1764344108.493966-34008-153147276120333/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=25e801a9a05537c191e2aa500f19076ac31d3e5b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:35:24 np0005538960 python3[30934]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:40:23 np0005538960 systemd[1]: session-8.scope: Deactivated successfully.
Nov 28 10:40:23 np0005538960 systemd[1]: session-8.scope: Consumed 5.005s CPU time.
Nov 28 10:40:23 np0005538960 systemd-logind[788]: Session 8 logged out. Waiting for processes to exit.
Nov 28 10:40:23 np0005538960 systemd-logind[788]: Removed session 8.
Nov 28 10:48:36 np0005538960 systemd-logind[788]: New session 9 of user zuul.
Nov 28 10:48:36 np0005538960 systemd[1]: Started Session 9 of User zuul.
Nov 28 10:48:37 np0005538960 python3.9[31109]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:48:39 np0005538960 python3.9[31290]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:48:47 np0005538960 systemd[1]: session-9.scope: Deactivated successfully.
Nov 28 10:48:47 np0005538960 systemd[1]: session-9.scope: Consumed 8.447s CPU time.
Nov 28 10:48:47 np0005538960 systemd-logind[788]: Session 9 logged out. Waiting for processes to exit.
Nov 28 10:48:47 np0005538960 systemd-logind[788]: Removed session 9.
Nov 28 10:48:53 np0005538960 systemd-logind[788]: New session 10 of user zuul.
Nov 28 10:48:53 np0005538960 systemd[1]: Started Session 10 of User zuul.
Nov 28 10:48:54 np0005538960 python3.9[31503]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:48:54 np0005538960 systemd[1]: session-10.scope: Deactivated successfully.
Nov 28 10:48:54 np0005538960 systemd-logind[788]: Session 10 logged out. Waiting for processes to exit.
Nov 28 10:48:54 np0005538960 systemd-logind[788]: Removed session 10.
Nov 28 10:49:11 np0005538960 systemd-logind[788]: New session 11 of user zuul.
Nov 28 10:49:11 np0005538960 systemd[1]: Started Session 11 of User zuul.
Nov 28 10:49:12 np0005538960 python3.9[31683]: ansible-ansible.legacy.ping Invoked with data=pong
Nov 28 10:49:14 np0005538960 python3.9[31857]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:49:15 np0005538960 python3.9[32009]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:49:16 np0005538960 python3.9[32162]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:49:17 np0005538960 python3.9[32314]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:49:17 np0005538960 python3.9[32466]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:49:18 np0005538960 python3.9[32589]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764344957.4490721-178-46393769370160/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:49:19 np0005538960 python3.9[32741]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:49:20 np0005538960 python3.9[32897]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:49:21 np0005538960 python3.9[33049]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:49:22 np0005538960 python3.9[33199]: ansible-ansible.builtin.service_facts Invoked
Nov 28 10:49:27 np0005538960 python3.9[33453]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:49:28 np0005538960 python3.9[33603]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:49:30 np0005538960 python3.9[33757]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:49:31 np0005538960 python3.9[33915]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:49:32 np0005538960 python3.9[33999]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:50:13 np0005538960 systemd[1]: Reloading.
Nov 28 10:50:13 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:50:14 np0005538960 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Nov 28 10:50:14 np0005538960 systemd[1]: Reloading.
Nov 28 10:50:14 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:50:14 np0005538960 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Nov 28 10:50:14 np0005538960 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Nov 28 10:50:14 np0005538960 systemd[1]: Reloading.
Nov 28 10:50:14 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:50:14 np0005538960 systemd[1]: Listening on LVM2 poll daemon socket.
Nov 28 10:50:15 np0005538960 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Nov 28 10:50:15 np0005538960 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Nov 28 10:50:15 np0005538960 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Nov 28 10:51:21 np0005538960 kernel: SELinux:  Converting 2717 SID table entries...
Nov 28 10:51:21 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 10:51:21 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 10:51:21 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 10:51:21 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 10:51:21 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 10:51:21 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 10:51:21 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 10:51:21 np0005538960 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Nov 28 10:51:21 np0005538960 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 10:51:21 np0005538960 systemd[1]: Starting man-db-cache-update.service...
Nov 28 10:51:21 np0005538960 systemd[1]: Reloading.
Nov 28 10:51:21 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:51:21 np0005538960 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 10:51:23 np0005538960 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 10:51:23 np0005538960 systemd[1]: Finished man-db-cache-update.service.
Nov 28 10:51:23 np0005538960 systemd[1]: man-db-cache-update.service: Consumed 1.697s CPU time.
Nov 28 10:51:23 np0005538960 systemd[1]: run-rb623a017684f49be8a11a8d17467d4f4.service: Deactivated successfully.
Nov 28 10:51:24 np0005538960 python3.9[35529]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:51:26 np0005538960 python3.9[35810]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Nov 28 10:51:27 np0005538960 python3.9[35962]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Nov 28 10:51:32 np0005538960 python3.9[36115]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:51:33 np0005538960 python3.9[36267]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Nov 28 10:51:37 np0005538960 python3.9[36419]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:51:42 np0005538960 python3.9[36572]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:51:43 np0005538960 python3.9[36695]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345097.5576935-668-117204067278312/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eedf92a785b2ae02f8a70da1ae583bf08644d85e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:51:44 np0005538960 python3.9[36847]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:51:45 np0005538960 python3.9[36999]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:51:45 np0005538960 python3.9[37152]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:51:47 np0005538960 python3.9[37304]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Nov 28 10:51:47 np0005538960 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 10:51:48 np0005538960 python3.9[37458]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 10:51:49 np0005538960 python3.9[37616]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 10:51:50 np0005538960 python3.9[37776]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Nov 28 10:51:51 np0005538960 python3.9[37929]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 10:51:52 np0005538960 python3.9[38087]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Nov 28 10:51:53 np0005538960 python3.9[38239]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:51:56 np0005538960 python3.9[38392]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:51:56 np0005538960 python3.9[38544]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:51:57 np0005538960 python3.9[38667]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345116.3891306-1024-38235841476328/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:51:58 np0005538960 python3.9[38819]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 10:51:58 np0005538960 systemd[1]: Starting Load Kernel Modules...
Nov 28 10:51:58 np0005538960 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Nov 28 10:51:58 np0005538960 kernel: Bridge firewalling registered
Nov 28 10:51:58 np0005538960 systemd-modules-load[38823]: Inserted module 'br_netfilter'
Nov 28 10:51:58 np0005538960 systemd[1]: Finished Load Kernel Modules.
Nov 28 10:51:59 np0005538960 python3.9[38979]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:52:00 np0005538960 python3.9[39102]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345119.2597466-1094-117523742787788/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:52:01 np0005538960 python3.9[39254]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:52:04 np0005538960 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Nov 28 10:52:04 np0005538960 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Nov 28 10:52:05 np0005538960 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 10:52:05 np0005538960 systemd[1]: Starting man-db-cache-update.service...
Nov 28 10:52:05 np0005538960 systemd[1]: Reloading.
Nov 28 10:52:05 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:52:05 np0005538960 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 10:52:07 np0005538960 python3.9[40857]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:52:08 np0005538960 python3.9[42033]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Nov 28 10:52:09 np0005538960 python3.9[42802]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:52:09 np0005538960 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 10:52:09 np0005538960 systemd[1]: Finished man-db-cache-update.service.
Nov 28 10:52:09 np0005538960 systemd[1]: man-db-cache-update.service: Consumed 5.626s CPU time.
Nov 28 10:52:09 np0005538960 systemd[1]: run-rec1f81e004924ea09676cadda3747048.service: Deactivated successfully.
Nov 28 10:52:10 np0005538960 python3.9[43416]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:52:10 np0005538960 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 10:52:10 np0005538960 systemd[1]: Starting Authorization Manager...
Nov 28 10:52:10 np0005538960 polkitd[43633]: Started polkitd version 0.117
Nov 28 10:52:10 np0005538960 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 10:52:10 np0005538960 systemd[1]: Started Authorization Manager.
Nov 28 10:52:11 np0005538960 python3.9[43803]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 10:52:12 np0005538960 systemd[1]: Stopping Dynamic System Tuning Daemon...
Nov 28 10:52:12 np0005538960 systemd[1]: tuned.service: Deactivated successfully.
Nov 28 10:52:12 np0005538960 systemd[1]: Stopped Dynamic System Tuning Daemon.
Nov 28 10:52:12 np0005538960 systemd[1]: Starting Dynamic System Tuning Daemon...
Nov 28 10:52:12 np0005538960 systemd[1]: Started Dynamic System Tuning Daemon.
Nov 28 10:52:13 np0005538960 python3.9[43965]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Nov 28 10:52:17 np0005538960 python3.9[44117]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 10:52:17 np0005538960 systemd[1]: Reloading.
Nov 28 10:52:17 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:52:18 np0005538960 python3.9[44307]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 10:52:19 np0005538960 systemd[1]: Reloading.
Nov 28 10:52:19 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:52:20 np0005538960 python3.9[44495]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:52:21 np0005538960 python3.9[44648]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:52:21 np0005538960 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Nov 28 10:52:22 np0005538960 python3.9[44801]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:52:24 np0005538960 python3.9[44965]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:52:25 np0005538960 python3.9[45118]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 10:52:25 np0005538960 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Nov 28 10:52:25 np0005538960 systemd[1]: Stopped Apply Kernel Variables.
Nov 28 10:52:25 np0005538960 systemd[1]: Stopping Apply Kernel Variables...
Nov 28 10:52:25 np0005538960 systemd[1]: Starting Apply Kernel Variables...
Nov 28 10:52:25 np0005538960 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Nov 28 10:52:25 np0005538960 systemd[1]: Finished Apply Kernel Variables.
Nov 28 10:52:26 np0005538960 systemd[1]: session-11.scope: Deactivated successfully.
Nov 28 10:52:26 np0005538960 systemd[1]: session-11.scope: Consumed 2min 19.896s CPU time.
Nov 28 10:52:26 np0005538960 systemd-logind[788]: Session 11 logged out. Waiting for processes to exit.
Nov 28 10:52:26 np0005538960 systemd-logind[788]: Removed session 11.
Nov 28 10:52:32 np0005538960 systemd-logind[788]: New session 12 of user zuul.
Nov 28 10:52:32 np0005538960 systemd[1]: Started Session 12 of User zuul.
Nov 28 10:52:33 np0005538960 python3.9[45301]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:52:35 np0005538960 python3.9[45455]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:52:36 np0005538960 python3.9[45611]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:52:37 np0005538960 python3.9[45762]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:52:38 np0005538960 python3.9[45918]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:52:39 np0005538960 python3.9[46002]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:52:42 np0005538960 python3.9[46155]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:52:43 np0005538960 python3.9[46326]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:52:44 np0005538960 python3.9[46478]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:52:44 np0005538960 systemd[1]: var-lib-containers-storage-overlay-compat1364874775-merged.mount: Deactivated successfully.
Nov 28 10:52:44 np0005538960 podman[46479]: 2025-11-28 15:52:44.164398041 +0000 UTC m=+0.059304023 system refresh
Nov 28 10:52:45 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:52:45 np0005538960 python3.9[46641]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:52:45 np0005538960 python3.9[46764]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345164.4721382-288-224466356739046/.source.json follow=False _original_basename=podman_network_config.j2 checksum=2489dde3d37adf8a25b3f37ece321abefadc7143 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:52:46 np0005538960 python3.9[46916]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:52:47 np0005538960 python3.9[47039]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345166.1388354-333-186760123354182/.source.conf follow=False _original_basename=registries.conf.j2 checksum=361187d514446359e6e2ae433816ff3f7630be78 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:52:48 np0005538960 python3.9[47191]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:52:49 np0005538960 python3.9[47343]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:52:49 np0005538960 python3.9[47495]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:52:50 np0005538960 python3.9[47647]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:52:51 np0005538960 python3.9[47797]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:52:52 np0005538960 python3.9[47951]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:52:54 np0005538960 python3.9[48104]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:52:57 np0005538960 python3.9[48264]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:53:00 np0005538960 python3.9[48417]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:53:02 np0005538960 python3.9[48570]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:53:04 np0005538960 python3.9[48726]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:53:09 np0005538960 python3.9[48896]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:53:11 np0005538960 python3.9[49049]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:53:26 np0005538960 python3.9[49385]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:53:29 np0005538960 python3.9[49541]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:53:30 np0005538960 python3.9[49716]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:53:30 np0005538960 python3.9[49839]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1764345209.6561193-777-132988554758236/.source.json _original_basename=.x7zdnlvv follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:53:32 np0005538960 python3.9[49991]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 10:53:32 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:35 np0005538960 systemd[1]: var-lib-containers-storage-overlay-compat961245748-lower\x2dmapped.mount: Deactivated successfully.
Nov 28 10:53:38 np0005538960 podman[50003]: 2025-11-28 15:53:38.521681469 +0000 UTC m=+6.189353156 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 10:53:38 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:38 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:38 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:39 np0005538960 python3.9[50297]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 10:53:40 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:52 np0005538960 podman[50310]: 2025-11-28 15:53:52.179629397 +0000 UTC m=+12.147846131 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 10:53:52 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:52 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:52 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:53 np0005538960 python3.9[50607]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 10:53:53 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:54 np0005538960 podman[50619]: 2025-11-28 15:53:54.717257178 +0000 UTC m=+1.125695658 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 10:53:54 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:54 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:54 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:53:55 np0005538960 python3.9[50853]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 10:53:55 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:11 np0005538960 podman[50865]: 2025-11-28 15:54:11.836151418 +0000 UTC m=+15.910471999 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 10:54:11 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:11 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:11 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:13 np0005538960 python3.9[51127]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 10:54:13 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:19 np0005538960 podman[51139]: 2025-11-28 15:54:19.328717225 +0000 UTC m=+6.120952210 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 10:54:19 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:19 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:19 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:20 np0005538960 python3.9[51396]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Nov 28 10:54:20 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:21 np0005538960 podman[51408]: 2025-11-28 15:54:21.962842738 +0000 UTC m=+1.405466726 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 28 10:54:21 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:22 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:22 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:54:23 np0005538960 systemd[1]: session-12.scope: Deactivated successfully.
Nov 28 10:54:23 np0005538960 systemd[1]: session-12.scope: Consumed 1min 59.977s CPU time.
Nov 28 10:54:23 np0005538960 systemd-logind[788]: Session 12 logged out. Waiting for processes to exit.
Nov 28 10:54:23 np0005538960 systemd-logind[788]: Removed session 12.
Nov 28 10:54:28 np0005538960 systemd-logind[788]: New session 13 of user zuul.
Nov 28 10:54:28 np0005538960 systemd[1]: Started Session 13 of User zuul.
Nov 28 10:54:29 np0005538960 python3.9[51708]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:54:31 np0005538960 python3.9[51864]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Nov 28 10:54:32 np0005538960 python3.9[52017]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 10:54:33 np0005538960 python3.9[52175]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 10:54:34 np0005538960 python3.9[52335]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:54:35 np0005538960 python3.9[52419]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:54:38 np0005538960 python3.9[52580]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:54:52 np0005538960 kernel: SELinux:  Converting 2730 SID table entries...
Nov 28 10:54:52 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 10:54:52 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 10:54:52 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 10:54:52 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 10:54:52 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 10:54:52 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 10:54:52 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 10:54:53 np0005538960 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Nov 28 10:54:53 np0005538960 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Nov 28 10:54:54 np0005538960 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 10:54:54 np0005538960 systemd[1]: Starting man-db-cache-update.service...
Nov 28 10:54:54 np0005538960 systemd[1]: Reloading.
Nov 28 10:54:54 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:54:54 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 10:54:54 np0005538960 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 10:54:55 np0005538960 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 10:54:55 np0005538960 systemd[1]: Finished man-db-cache-update.service.
Nov 28 10:54:55 np0005538960 systemd[1]: run-rcebede93254144feab6b29a3219e03aa.service: Deactivated successfully.
Nov 28 10:54:56 np0005538960 irqbalance[782]: Cannot change IRQ 27 affinity: Operation not permitted
Nov 28 10:54:56 np0005538960 irqbalance[782]: IRQ 27 affinity is now unmanaged
Nov 28 10:54:57 np0005538960 python3.9[53679]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 10:54:57 np0005538960 systemd[1]: Reloading.
Nov 28 10:54:57 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:54:57 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 10:54:57 np0005538960 systemd[1]: Starting Open vSwitch Database Unit...
Nov 28 10:54:57 np0005538960 chown[53720]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Nov 28 10:54:57 np0005538960 ovs-ctl[53725]: /etc/openvswitch/conf.db does not exist ... (warning).
Nov 28 10:54:57 np0005538960 ovs-ctl[53725]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Nov 28 10:54:57 np0005538960 ovs-ctl[53725]: Starting ovsdb-server [  OK  ]
Nov 28 10:54:57 np0005538960 ovs-vsctl[53774]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Nov 28 10:54:57 np0005538960 ovs-vsctl[53794]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"ac0d1e81-02b2-487b-bc65-46ccb331e9e4\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Nov 28 10:54:57 np0005538960 ovs-ctl[53725]: Configuring Open vSwitch system IDs [  OK  ]
Nov 28 10:54:58 np0005538960 ovs-ctl[53725]: Enabling remote OVSDB managers [  OK  ]
Nov 28 10:54:58 np0005538960 systemd[1]: Started Open vSwitch Database Unit.
Nov 28 10:54:58 np0005538960 ovs-vsctl[53800]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 28 10:54:58 np0005538960 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Nov 28 10:54:58 np0005538960 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Nov 28 10:54:58 np0005538960 systemd[1]: Starting Open vSwitch Forwarding Unit...
Nov 28 10:54:58 np0005538960 kernel: openvswitch: Open vSwitch switching datapath
Nov 28 10:54:58 np0005538960 ovs-ctl[53845]: Inserting openvswitch module [  OK  ]
Nov 28 10:54:58 np0005538960 ovs-ctl[53814]: Starting ovs-vswitchd [  OK  ]
Nov 28 10:54:58 np0005538960 ovs-vsctl[53862]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Nov 28 10:54:58 np0005538960 ovs-ctl[53814]: Enabling remote OVSDB managers [  OK  ]
Nov 28 10:54:58 np0005538960 systemd[1]: Started Open vSwitch Forwarding Unit.
Nov 28 10:54:58 np0005538960 systemd[1]: Starting Open vSwitch...
Nov 28 10:54:58 np0005538960 systemd[1]: Finished Open vSwitch.
Nov 28 10:54:59 np0005538960 python3.9[54014]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:55:00 np0005538960 python3.9[54166]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Nov 28 10:55:01 np0005538960 kernel: SELinux:  Converting 2744 SID table entries...
Nov 28 10:55:01 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 10:55:01 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 10:55:01 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 10:55:01 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 10:55:01 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 10:55:01 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 10:55:01 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 10:55:03 np0005538960 python3.9[54321]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:55:04 np0005538960 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Nov 28 10:55:04 np0005538960 python3.9[54479]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:55:06 np0005538960 python3.9[54632]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:55:08 np0005538960 python3.9[54919]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 10:55:09 np0005538960 python3.9[55069]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:55:10 np0005538960 python3.9[55223]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:55:12 np0005538960 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 10:55:12 np0005538960 systemd[1]: Starting man-db-cache-update.service...
Nov 28 10:55:12 np0005538960 systemd[1]: Reloading.
Nov 28 10:55:12 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:55:12 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 10:55:12 np0005538960 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 10:55:12 np0005538960 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 10:55:12 np0005538960 systemd[1]: Finished man-db-cache-update.service.
Nov 28 10:55:12 np0005538960 systemd[1]: run-r247866d96a17470388dff390f44c6fb7.service: Deactivated successfully.
Nov 28 10:55:13 np0005538960 python3.9[55540]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 10:55:14 np0005538960 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Nov 28 10:55:14 np0005538960 systemd[1]: Stopped Network Manager Wait Online.
Nov 28 10:55:14 np0005538960 systemd[1]: Stopping Network Manager Wait Online...
Nov 28 10:55:14 np0005538960 systemd[1]: Stopping Network Manager...
Nov 28 10:55:14 np0005538960 NetworkManager[7222]: <info>  [1764345314.0313] caught SIGTERM, shutting down normally.
Nov 28 10:55:14 np0005538960 NetworkManager[7222]: <info>  [1764345314.0333] dhcp4 (eth0): canceled DHCP transaction
Nov 28 10:55:14 np0005538960 NetworkManager[7222]: <info>  [1764345314.0334] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 10:55:14 np0005538960 NetworkManager[7222]: <info>  [1764345314.0334] dhcp4 (eth0): state changed no lease
Nov 28 10:55:14 np0005538960 NetworkManager[7222]: <info>  [1764345314.0337] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 10:55:14 np0005538960 NetworkManager[7222]: <info>  [1764345314.0417] exiting (success)
Nov 28 10:55:14 np0005538960 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 10:55:14 np0005538960 systemd[1]: NetworkManager.service: Deactivated successfully.
Nov 28 10:55:14 np0005538960 systemd[1]: Stopped Network Manager.
Nov 28 10:55:14 np0005538960 systemd[1]: NetworkManager.service: Consumed 14.438s CPU time, 4.1M memory peak, read 0B from disk, written 31.5K to disk.
Nov 28 10:55:14 np0005538960 systemd[1]: Starting Network Manager...
Nov 28 10:55:14 np0005538960 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.1193] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:173012fb-6f90-4e2d-8a88-62d31adeba03)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.1194] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.1267] manager[0x55cbbdddf090]: monitoring kernel firmware directory '/lib/firmware'.
Nov 28 10:55:14 np0005538960 systemd[1]: Starting Hostname Service...
Nov 28 10:55:14 np0005538960 systemd[1]: Started Hostname Service.
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2443] hostname: hostname: using hostnamed
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2445] hostname: static hostname changed from (none) to "compute-1"
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2452] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2462] manager[0x55cbbdddf090]: rfkill: Wi-Fi hardware radio set enabled
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2463] manager[0x55cbbdddf090]: rfkill: WWAN hardware radio set enabled
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2500] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2516] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2517] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2518] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2520] manager: Networking is enabled by state file
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2523] settings: Loaded settings plugin: keyfile (internal)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2528] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2578] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2593] dhcp: init: Using DHCP client 'internal'
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2597] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2605] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2613] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2626] device (lo): Activation: starting connection 'lo' (4b798dc0-9e3b-4d95-858d-136ef1166398)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2638] device (eth0): carrier: link connected
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2645] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2654] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2655] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2665] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2677] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2686] device (eth1): carrier: link connected
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2693] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2701] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (b8ecb49a-bdd7-562b-970f-30977a791a02) (indicated)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2702] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2711] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2723] device (eth1): Activation: starting connection 'ci-private-network' (b8ecb49a-bdd7-562b-970f-30977a791a02)
Nov 28 10:55:14 np0005538960 systemd[1]: Started Network Manager.
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2732] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2760] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2764] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2767] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2771] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2775] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2780] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2784] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2788] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2801] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2807] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2823] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2849] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2864] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2868] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.2878] device (lo): Activation: successful, device activated.
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3117] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3124] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3198] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 systemd[1]: Starting Network Manager Wait Online...
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3208] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3210] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3212] manager: NetworkManager state is now CONNECTED_LOCAL
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3214] device (eth1): Activation: successful, device activated.
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3246] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3248] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3251] manager: NetworkManager state is now CONNECTED_SITE
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3253] device (eth0): Activation: successful, device activated.
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3257] manager: NetworkManager state is now CONNECTED_GLOBAL
Nov 28 10:55:14 np0005538960 NetworkManager[55548]: <info>  [1764345314.3259] manager: startup complete
Nov 28 10:55:14 np0005538960 systemd[1]: Finished Network Manager Wait Online.
Nov 28 10:55:15 np0005538960 python3.9[55766]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:55:20 np0005538960 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 10:55:20 np0005538960 systemd[1]: Starting man-db-cache-update.service...
Nov 28 10:55:20 np0005538960 systemd[1]: Reloading.
Nov 28 10:55:20 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:55:20 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 10:55:20 np0005538960 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 10:55:21 np0005538960 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 10:55:21 np0005538960 systemd[1]: Finished man-db-cache-update.service.
Nov 28 10:55:21 np0005538960 systemd[1]: run-r0d5c739283e74d608922c8b0606579cb.service: Deactivated successfully.
Nov 28 10:55:24 np0005538960 python3.9[56224]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:55:24 np0005538960 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 10:55:25 np0005538960 python3.9[56376]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:55:26 np0005538960 python3.9[56530]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:55:27 np0005538960 python3.9[56682]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:55:27 np0005538960 python3.9[56834]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:55:28 np0005538960 python3.9[56986]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:55:29 np0005538960 python3.9[57138]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:55:30 np0005538960 python3.9[57261]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345328.9175234-648-76435200747714/.source _original_basename=.e4f9dyz_ follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:55:31 np0005538960 python3.9[57413]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:55:31 np0005538960 python3.9[57565]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Nov 28 10:55:32 np0005538960 python3.9[57717]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:55:35 np0005538960 python3.9[58144]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Nov 28 10:55:36 np0005538960 ansible-async_wrapper.py[58319]: Invoked with j163210660891 300 /home/zuul/.ansible/tmp/ansible-tmp-1764345335.9310517-846-177387259930417/AnsiballZ_edpm_os_net_config.py _
Nov 28 10:55:36 np0005538960 ansible-async_wrapper.py[58322]: Starting module and watcher
Nov 28 10:55:36 np0005538960 ansible-async_wrapper.py[58322]: Start watching 58323 (300)
Nov 28 10:55:36 np0005538960 ansible-async_wrapper.py[58323]: Start module (58323)
Nov 28 10:55:36 np0005538960 ansible-async_wrapper.py[58319]: Return async_wrapper task started.
Nov 28 10:55:37 np0005538960 python3.9[58324]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Nov 28 10:55:37 np0005538960 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Nov 28 10:55:37 np0005538960 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Nov 28 10:55:37 np0005538960 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Nov 28 10:55:37 np0005538960 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Nov 28 10:55:37 np0005538960 kernel: cfg80211: failed to load regulatory.db
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.1503] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.1525] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2246] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2248] audit: op="connection-add" uuid="313c0cab-9d34-4d98-91e3-07586d799268" name="br-ex-br" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2271] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2272] audit: op="connection-add" uuid="4cc1133b-4cb4-4339-a912-bcc756711f4c" name="br-ex-port" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2289] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2290] audit: op="connection-add" uuid="6f8d87c7-aa41-4f11-bc92-6c846e2e7857" name="eth1-port" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2306] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2307] audit: op="connection-add" uuid="400f40ee-4ea3-4179-9afd-eb5c9244771b" name="vlan20-port" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2323] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2324] audit: op="connection-add" uuid="5d495c91-8d1f-4fc5-b963-095b1517a932" name="vlan21-port" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2340] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2342] audit: op="connection-add" uuid="d507bd3c-01af-40ca-a60c-c085d89d99cf" name="vlan22-port" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2368] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2388] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2390] audit: op="connection-add" uuid="fcec750a-b2b6-4c71-8b99-5b34363fdeb8" name="br-ex-if" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2474] audit: op="connection-update" uuid="b8ecb49a-bdd7-562b-970f-30977a791a02" name="ci-private-network" args="ovs-interface.type,connection.controller,connection.slave-type,connection.timestamp,connection.master,connection.port-type,ipv6.routing-rules,ipv6.addresses,ipv6.addr-gen-mode,ipv6.dns,ipv6.routes,ipv6.method,ipv4.routing-rules,ipv4.never-default,ipv4.addresses,ipv4.dns,ipv4.method,ipv4.routes,ovs-external-ids.data" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2495] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2497] audit: op="connection-add" uuid="086ff765-3681-4efe-84c2-deb00b15d25e" name="vlan20-if" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2518] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2519] audit: op="connection-add" uuid="ee11873d-a508-4fae-ae6a-6e0c03c7679b" name="vlan21-if" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2541] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2542] audit: op="connection-add" uuid="44dfb805-fea6-4c07-8232-9f9e9ad32945" name="vlan22-if" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2560] audit: op="connection-delete" uuid="526edb83-c117-3f66-855f-1f77ff3e3e99" name="Wired connection 1" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2582] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2602] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2610] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (313c0cab-9d34-4d98-91e3-07586d799268)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2611] audit: op="connection-activate" uuid="313c0cab-9d34-4d98-91e3-07586d799268" name="br-ex-br" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2614] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2626] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2633] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (4cc1133b-4cb4-4339-a912-bcc756711f4c)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2636] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2646] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2654] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (6f8d87c7-aa41-4f11-bc92-6c846e2e7857)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2657] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2669] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2674] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (400f40ee-4ea3-4179-9afd-eb5c9244771b)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2676] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2682] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2686] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (5d495c91-8d1f-4fc5-b963-095b1517a932)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2689] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2696] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2701] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (d507bd3c-01af-40ca-a60c-c085d89d99cf)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2702] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2704] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2706] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2716] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2721] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2726] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (fcec750a-b2b6-4c71-8b99-5b34363fdeb8)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2726] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2731] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2732] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2734] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2736] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2751] device (eth1): disconnecting for new activation request.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2752] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2755] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2757] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2758] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2761] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2765] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2772] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (086ff765-3681-4efe-84c2-deb00b15d25e)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2773] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2776] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2779] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2780] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2783] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2788] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2792] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (ee11873d-a508-4fae-ae6a-6e0c03c7679b)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2793] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2796] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2798] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2800] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2802] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2808] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2813] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (44dfb805-fea6-4c07-8232-9f9e9ad32945)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2814] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2816] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2819] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2820] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2822] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2840] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-timeout,ipv4.dhcp-client-id,802-3-ethernet.mtu" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2843] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2846] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2849] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2864] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 kernel: ovs-system: entered promiscuous mode
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2869] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2874] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2877] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2879] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2885] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2890] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2893] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2896] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2902] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2906] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 kernel: Timeout policy base is empty
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2909] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 systemd-udevd[58330]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2911] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2917] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2922] dhcp4 (eth0): canceled DHCP transaction
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2922] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2922] dhcp4 (eth0): state changed no lease
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2924] dhcp4 (eth0): activation: beginning transaction (no timeout)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2936] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2944] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58325 uid=0 result="fail" reason="Device is not activated"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2949] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2957] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2966] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2979] device (eth1): disconnecting for new activation request.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2981] audit: op="connection-activate" uuid="b8ecb49a-bdd7-562b-970f-30977a791a02" name="ci-private-network" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.2984] dhcp4 (eth0): state changed new lease, address=38.102.83.204
Nov 28 10:55:39 np0005538960 systemd[1]: Starting Network Manager Script Dispatcher Service...
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3057] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58325 uid=0 result="success"
Nov 28 10:55:39 np0005538960 systemd[1]: Started Network Manager Script Dispatcher Service.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3136] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3252] device (eth1): Activation: starting connection 'ci-private-network' (b8ecb49a-bdd7-562b-970f-30977a791a02)
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3262] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3274] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3278] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3287] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3293] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3301] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3302] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3303] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3304] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3305] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3322] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3329] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3334] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3338] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3342] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3346] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3349] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3355] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3359] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3366] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3371] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3379] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3396] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3444] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3446] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3454] device (eth1): Activation: successful, device activated.
Nov 28 10:55:39 np0005538960 kernel: br-ex: entered promiscuous mode
Nov 28 10:55:39 np0005538960 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3688] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3701] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3738] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3741] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3751] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 10:55:39 np0005538960 kernel: vlan22: entered promiscuous mode
Nov 28 10:55:39 np0005538960 kernel: vlan20: entered promiscuous mode
Nov 28 10:55:39 np0005538960 systemd-udevd[58329]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3897] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3906] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3928] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3930] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.3939] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 10:55:39 np0005538960 kernel: vlan21: entered promiscuous mode
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.4014] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.4022] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.4042] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.4043] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.4050] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.4095] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.4117] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.4138] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.4139] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Nov 28 10:55:39 np0005538960 NetworkManager[55548]: <info>  [1764345339.4149] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Nov 28 10:55:40 np0005538960 NetworkManager[55548]: <info>  [1764345340.5252] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58325 uid=0 result="success"
Nov 28 10:55:40 np0005538960 NetworkManager[55548]: <info>  [1764345340.6698] checkpoint[0x55cbbddb5950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Nov 28 10:55:40 np0005538960 NetworkManager[55548]: <info>  [1764345340.6701] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58325 uid=0 result="success"
Nov 28 10:55:40 np0005538960 python3.9[58657]: ansible-ansible.legacy.async_status Invoked with jid=j163210660891.58319 mode=status _async_dir=/root/.ansible_async
Nov 28 10:55:40 np0005538960 NetworkManager[55548]: <info>  [1764345340.9678] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58325 uid=0 result="success"
Nov 28 10:55:40 np0005538960 NetworkManager[55548]: <info>  [1764345340.9695] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58325 uid=0 result="success"
Nov 28 10:55:41 np0005538960 NetworkManager[55548]: <info>  [1764345341.2041] audit: op="networking-control" arg="global-dns-configuration" pid=58325 uid=0 result="success"
Nov 28 10:55:41 np0005538960 NetworkManager[55548]: <info>  [1764345341.2073] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Nov 28 10:55:41 np0005538960 NetworkManager[55548]: <info>  [1764345341.2104] audit: op="networking-control" arg="global-dns-configuration" pid=58325 uid=0 result="success"
Nov 28 10:55:41 np0005538960 NetworkManager[55548]: <info>  [1764345341.2142] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58325 uid=0 result="success"
Nov 28 10:55:41 np0005538960 NetworkManager[55548]: <info>  [1764345341.3450] checkpoint[0x55cbbddb5a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Nov 28 10:55:41 np0005538960 NetworkManager[55548]: <info>  [1764345341.3453] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58325 uid=0 result="success"
Nov 28 10:55:41 np0005538960 ansible-async_wrapper.py[58323]: Module complete (58323)
Nov 28 10:55:41 np0005538960 ansible-async_wrapper.py[58322]: Done in kid B.
Nov 28 10:55:44 np0005538960 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 10:55:44 np0005538960 python3.9[58763]: ansible-ansible.legacy.async_status Invoked with jid=j163210660891.58319 mode=status _async_dir=/root/.ansible_async
Nov 28 10:55:44 np0005538960 python3.9[58866]: ansible-ansible.legacy.async_status Invoked with jid=j163210660891.58319 mode=cleanup _async_dir=/root/.ansible_async
Nov 28 10:55:45 np0005538960 python3.9[59018]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:55:46 np0005538960 python3.9[59141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345345.2220533-927-131197542861224/.source.returncode _original_basename=.de_3g6h4 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:55:47 np0005538960 python3.9[59293]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:55:48 np0005538960 python3.9[59417]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345346.9457102-975-99921904698766/.source.cfg _original_basename=.xn9yw6cn follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:55:49 np0005538960 python3.9[59569]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 10:55:49 np0005538960 systemd[1]: Reloading Network Manager...
Nov 28 10:55:49 np0005538960 NetworkManager[55548]: <info>  [1764345349.2606] audit: op="reload" arg="0" pid=59573 uid=0 result="success"
Nov 28 10:55:49 np0005538960 NetworkManager[55548]: <info>  [1764345349.2617] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Nov 28 10:55:49 np0005538960 systemd[1]: Reloaded Network Manager.
Nov 28 10:55:49 np0005538960 systemd[1]: session-13.scope: Deactivated successfully.
Nov 28 10:55:49 np0005538960 systemd[1]: session-13.scope: Consumed 54.547s CPU time.
Nov 28 10:55:49 np0005538960 systemd-logind[788]: Session 13 logged out. Waiting for processes to exit.
Nov 28 10:55:49 np0005538960 systemd-logind[788]: Removed session 13.
Nov 28 10:55:55 np0005538960 systemd-logind[788]: New session 14 of user zuul.
Nov 28 10:55:55 np0005538960 systemd[1]: Started Session 14 of User zuul.
Nov 28 10:55:56 np0005538960 python3.9[59757]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:55:58 np0005538960 python3.9[59912]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:55:59 np0005538960 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Nov 28 10:55:59 np0005538960 python3.9[60101]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:56:00 np0005538960 systemd[1]: session-14.scope: Deactivated successfully.
Nov 28 10:56:00 np0005538960 systemd[1]: session-14.scope: Consumed 2.788s CPU time.
Nov 28 10:56:00 np0005538960 systemd-logind[788]: Session 14 logged out. Waiting for processes to exit.
Nov 28 10:56:00 np0005538960 systemd-logind[788]: Removed session 14.
Nov 28 10:56:05 np0005538960 systemd-logind[788]: New session 15 of user zuul.
Nov 28 10:56:05 np0005538960 systemd[1]: Started Session 15 of User zuul.
Nov 28 10:56:07 np0005538960 python3.9[60285]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:56:08 np0005538960 python3.9[60439]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:56:09 np0005538960 python3.9[60595]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:56:10 np0005538960 python3.9[60680]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:56:12 np0005538960 python3.9[60833]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:56:13 np0005538960 python3.9[61024]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:56:14 np0005538960 python3.9[61176]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:56:15 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 10:56:16 np0005538960 python3.9[61339]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:56:16 np0005538960 python3.9[61417]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:56:17 np0005538960 python3.9[61569]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:56:18 np0005538960 python3.9[61647]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:56:19 np0005538960 python3.9[61799]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:56:19 np0005538960 python3.9[61951]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:56:20 np0005538960 python3.9[62103]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:56:21 np0005538960 python3.9[62255]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:56:23 np0005538960 python3.9[62407]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:56:25 np0005538960 python3.9[62560]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:56:26 np0005538960 python3.9[62714]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:56:27 np0005538960 python3.9[62866]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:56:28 np0005538960 python3.9[63018]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:56:29 np0005538960 python3.9[63171]: ansible-service_facts Invoked
Nov 28 10:56:29 np0005538960 network[63188]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 10:56:29 np0005538960 network[63189]: 'network-scripts' will be removed from distribution in near future.
Nov 28 10:56:29 np0005538960 network[63190]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 10:56:37 np0005538960 python3.9[63642]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:56:40 np0005538960 python3.9[63795]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Nov 28 10:56:42 np0005538960 python3.9[63947]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:56:42 np0005538960 python3.9[64072]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345401.6107557-658-208980561705193/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:56:44 np0005538960 python3.9[64226]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:56:44 np0005538960 python3.9[64351]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345403.4355576-703-264129416609637/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:56:47 np0005538960 python3.9[64505]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:56:49 np0005538960 python3.9[64659]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:56:50 np0005538960 python3.9[64743]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 10:56:52 np0005538960 python3.9[64897]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:56:53 np0005538960 python3.9[64981]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 10:56:54 np0005538960 chronyd[797]: chronyd exiting
Nov 28 10:56:54 np0005538960 systemd[1]: Stopping NTP client/server...
Nov 28 10:56:54 np0005538960 systemd[1]: chronyd.service: Deactivated successfully.
Nov 28 10:56:54 np0005538960 systemd[1]: Stopped NTP client/server.
Nov 28 10:56:54 np0005538960 systemd[1]: Starting NTP client/server...
Nov 28 10:56:54 np0005538960 chronyd[64989]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Nov 28 10:56:54 np0005538960 chronyd[64989]: Frequency -23.688 +/- 0.312 ppm read from /var/lib/chrony/drift
Nov 28 10:56:54 np0005538960 chronyd[64989]: Loaded seccomp filter (level 2)
Nov 28 10:56:54 np0005538960 systemd[1]: Started NTP client/server.
Nov 28 10:56:54 np0005538960 systemd[1]: session-15.scope: Deactivated successfully.
Nov 28 10:56:54 np0005538960 systemd[1]: session-15.scope: Consumed 30.088s CPU time.
Nov 28 10:56:54 np0005538960 systemd-logind[788]: Session 15 logged out. Waiting for processes to exit.
Nov 28 10:56:54 np0005538960 systemd-logind[788]: Removed session 15.
Nov 28 10:57:00 np0005538960 systemd-logind[788]: New session 16 of user zuul.
Nov 28 10:57:00 np0005538960 systemd[1]: Started Session 16 of User zuul.
Nov 28 10:57:01 np0005538960 python3.9[65168]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:57:02 np0005538960 python3.9[65324]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:04 np0005538960 python3.9[65499]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:04 np0005538960 python3.9[65577]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.azbxih9l recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:05 np0005538960 python3.9[65729]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:06 np0005538960 python3.9[65852]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345425.2670782-144-265973347014487/.source _original_basename=.hirr3vc6 follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:07 np0005538960 python3.9[66004]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:57:08 np0005538960 python3.9[66156]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:09 np0005538960 python3.9[66279]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345428.056187-216-111700862839012/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:57:10 np0005538960 python3.9[66431]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:10 np0005538960 python3.9[66554]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345429.5342562-216-138525819771791/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:57:11 np0005538960 python3.9[66706]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:12 np0005538960 python3.9[66858]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:13 np0005538960 python3.9[66981]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345431.887224-327-25929236025461/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:13 np0005538960 python3.9[67133]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:14 np0005538960 python3.9[67256]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345433.3752341-372-141648288350933/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:15 np0005538960 python3.9[67408]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 10:57:15 np0005538960 systemd[1]: Reloading.
Nov 28 10:57:16 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:57:16 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 10:57:16 np0005538960 systemd[1]: Reloading.
Nov 28 10:57:16 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:57:16 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 10:57:16 np0005538960 systemd[1]: Starting EDPM Container Shutdown...
Nov 28 10:57:16 np0005538960 systemd[1]: Finished EDPM Container Shutdown.
Nov 28 10:57:17 np0005538960 python3.9[67635]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:18 np0005538960 python3.9[67758]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345436.752633-441-181321561143955/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:19 np0005538960 python3.9[67910]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:19 np0005538960 python3.9[68033]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345438.6473465-486-65634707607513/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:20 np0005538960 python3.9[68185]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 10:57:20 np0005538960 systemd[1]: Reloading.
Nov 28 10:57:20 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:57:20 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 10:57:22 np0005538960 systemd[1]: Reloading.
Nov 28 10:57:22 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:57:22 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 10:57:22 np0005538960 systemd[1]: Starting Create netns directory...
Nov 28 10:57:22 np0005538960 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 10:57:22 np0005538960 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 10:57:22 np0005538960 systemd[1]: Finished Create netns directory.
Nov 28 10:57:23 np0005538960 python3.9[68413]: ansible-ansible.builtin.service_facts Invoked
Nov 28 10:57:23 np0005538960 network[68430]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 10:57:23 np0005538960 network[68431]: 'network-scripts' will be removed from distribution in near future.
Nov 28 10:57:23 np0005538960 network[68432]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 10:57:28 np0005538960 python3.9[68694]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 10:57:29 np0005538960 systemd[1]: Reloading.
Nov 28 10:57:29 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:57:29 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 10:57:29 np0005538960 systemd[1]: Stopping IPv4 firewall with iptables...
Nov 28 10:57:29 np0005538960 iptables.init[68735]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Nov 28 10:57:29 np0005538960 iptables.init[68735]: iptables: Flushing firewall rules: [  OK  ]
Nov 28 10:57:29 np0005538960 systemd[1]: iptables.service: Deactivated successfully.
Nov 28 10:57:29 np0005538960 systemd[1]: Stopped IPv4 firewall with iptables.
Nov 28 10:57:30 np0005538960 python3.9[68932]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 10:57:32 np0005538960 python3.9[69086]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 10:57:32 np0005538960 systemd[1]: Reloading.
Nov 28 10:57:32 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 10:57:32 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 10:57:32 np0005538960 systemd[1]: Starting Netfilter Tables...
Nov 28 10:57:32 np0005538960 systemd[1]: Finished Netfilter Tables.
Nov 28 10:57:33 np0005538960 python3.9[69278]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:57:35 np0005538960 python3.9[69431]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:35 np0005538960 python3.9[69556]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345454.5004025-693-69029955653515/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:36 np0005538960 python3.9[69709]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 10:57:37 np0005538960 systemd[1]: Reloading OpenSSH server daemon...
Nov 28 10:57:37 np0005538960 systemd[1]: Reloaded OpenSSH server daemon.
Nov 28 10:57:38 np0005538960 python3.9[69865]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:39 np0005538960 python3.9[70017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:40 np0005538960 python3.9[70140]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345458.9576895-786-82339315860279/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:41 np0005538960 python3.9[70292]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Nov 28 10:57:41 np0005538960 systemd[1]: Starting Time & Date Service...
Nov 28 10:57:41 np0005538960 systemd[1]: Started Time & Date Service.
Nov 28 10:57:42 np0005538960 python3.9[70448]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:43 np0005538960 python3.9[70600]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:44 np0005538960 python3.9[70723]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345462.99447-891-216508578579487/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:45 np0005538960 python3.9[70875]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:45 np0005538960 python3.9[70998]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345464.5285003-936-270907401974950/.source.yaml _original_basename=.wdutk5iv follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:46 np0005538960 python3.9[71150]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:47 np0005538960 python3.9[71273]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345466.0935276-981-36427168283971/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:48 np0005538960 python3.9[71425]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:57:49 np0005538960 python3.9[71578]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:57:50 np0005538960 python3[71731]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 10:57:51 np0005538960 python3.9[71883]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:51 np0005538960 python3.9[72006]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345470.3820698-1098-168733428026949/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:52 np0005538960 python3.9[72158]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:53 np0005538960 python3.9[72281]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345472.181302-1143-238186728135099/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:54 np0005538960 python3.9[72433]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:54 np0005538960 python3.9[72556]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345473.7248635-1188-108360115629469/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:56 np0005538960 python3.9[72708]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:56 np0005538960 python3.9[72831]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345475.2073982-1233-247509921919062/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:57 np0005538960 python3.9[72983]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:57:58 np0005538960 python3.9[73106]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345477.0139966-1278-163956735565596/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:57:59 np0005538960 python3.9[73258]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:58:00 np0005538960 python3.9[73410]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:58:01 np0005538960 python3.9[73569]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:58:02 np0005538960 python3.9[73723]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:58:03 np0005538960 python3.9[73875]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:58:04 np0005538960 python3.9[74027]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 28 10:58:04 np0005538960 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 10:58:04 np0005538960 python3.9[74181]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Nov 28 10:58:05 np0005538960 systemd[1]: session-16.scope: Deactivated successfully.
Nov 28 10:58:05 np0005538960 systemd[1]: session-16.scope: Consumed 43.213s CPU time.
Nov 28 10:58:05 np0005538960 systemd-logind[788]: Session 16 logged out. Waiting for processes to exit.
Nov 28 10:58:05 np0005538960 systemd-logind[788]: Removed session 16.
Nov 28 10:58:10 np0005538960 systemd-logind[788]: New session 17 of user zuul.
Nov 28 10:58:10 np0005538960 systemd[1]: Started Session 17 of User zuul.
Nov 28 10:58:11 np0005538960 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 10:58:11 np0005538960 python3.9[74362]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Nov 28 10:58:12 np0005538960 python3.9[74516]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:58:14 np0005538960 python3.9[74668]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:58:15 np0005538960 python3.9[74820]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9rwKgUFfsmDzn/FxfWyVjpuUWZUeIN1jb2X0GYUdnyA0zc6kIqdf3pAL/FDBf6gcfgqdgdqi/4Orog+Gz9rxAjpvek9TMkLR/dP4zhDAhZlnpyDrMIc1C1BXc3JOfnEtPqdX4hCsZD6B2L01ABkPNIf9kWraUUti4M89T5DFL2XVFNMM2uO6fW05YV7TiRq7ZsFHg1iWP0/hI76lOvZ6VYR+fcob30k56UH4wPK/XTFzAHaJBHhvvn+IvqeCXuhrwHAdhKALiB0ERD2Zc4YyUkvNeSDLn8OW8r9FNDs/rr1m/v5elSLoJfXQEwr8RajSC/un5O48PbqPOyu0bfocM60REjf/TL8v0bKFZmDlR64La9w73cgLv/MZo/3icKwAL/cXNjaLAFYhxcHCQHH8m0w61j2t7sqdj91VWzdtKxNfr8GfinqWOriHL38+4tgA9u/yxAGq31NLfl5U0RRjKVLrbKGuLYUarJpqRtF6CIHGDwpnf0suPqVS71Vx/nUU=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIPhyM/0z5uOmJlO+lWqsei/bhw2tTBjNljR371n0yWpy#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBNKOZT5E/YvALQKcZFHMEFukZzYTd63fbRDabWsM3TG6ntX62UdZhGKBzOHG0KYloV6tFYToiE2FNgWqOBLWoJE=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3rAktSm0pab1OmkUURLdXQVa48kXBCPU94PZqFHR2aISrzSWhd5+Th2+Es2jYRLawTnTw/1L5jY3XyOOMlTSgBtREecAEgt3d2wampEr89f9JWPGTC8ZxQ/n4qB+MbuGwYtDdziqytuiNod53tXEZ6m2fURwJnB1nMe3X6Az9/68t1CCRWc7u0Fs9nePtKsyfIMQzVHXXCV+uoFIUT4mN4sNWykg1/vXK+cBY/PafCXigg9rKKaYYJgZhAcNvu/DR6vBkJ61kCQTtoa6ReVM356W3G7o7XdpCEZDgBGMxKroiwPTBeteDJh2CN+b9S5MMm2CQ666VN5d89fPdEWxzd+qaGF0VsC01c5g2MCBykUrZB0ZIk3mGOXuI6y2Tdr/JPTZwNiILuka3NAGOTyOamIvo+E+HQO9BTOusPjCJuj+nsCGf55JcY4RQessSlQZX8shO4kuS8FBcSSmxkMKbfoOtn+yqBzNiPUnV6mVeX0A0XcH+CUN7/oEva5GVRdc=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDdzqyToGi6D0cCO4REkgj/LPS3OpMLbkbS1wvrDi1Td#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBELBBFBw9t89BUFEREY0erl/eSxwauBum2tuFKPNna9ofVcQ6W+wGOn2Y5PXX51v2e/aOwaHf7CTxkXgSdk2yQY=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCeiXKrSNimg6ocEACi+DbGbBZbi1Pp6RdyUPyuXxHxLCDoerADtifnnF+CGMax+467libLGViV/DZP7gWRW+NDn2yRqrehYKr+/2HBmYjjZvzaOWciJ++sQOtn5ViXJ52MJNu+zoQW9NeDlnKoG1ON2wD47/YW3ORdHeK18X8sCS6KGP5PkY45kfyvgWExMlw7MVVetS5rqyqp8+SXFxwyOJzzDTh6VNWxz3EnkLDv/B1PWPLxGSwYv7QA1FNXplwU1PWAkOlyybnKoE0iO/uff0IcmXebw6vZx/ynTFxir6IJSrJyRMrCPVGzKXPfzve5De8RQl6F79Fi566gxDrCWjyMBaWr78JZ+WPkLVGp8+ekQr2JeA0BlCio6E9Nx3qVVaKpEYx0FtNqmZU7gPof8isJ9LLmSCa8o/4vzBT/Rd7UtLjZbfmayfIFOX1bmL65GjU9k19xcZBg+pEJbePe2LnBNXwXCGMLT5kLmMCr6xdvxdWydn4aLf5Dk0dUHSs=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKjSZtQMo48qb9Jm3S7bcoiP5FkTucH2iGrGYjP3UKGo#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHE8CZFPC35LMxJ5T0xcYD5+BDovbYHhV082URysDC612ZX7cpQzgN0OnfH9EZfOoFvx+zWjvLPB5+swIIeIDjg=#012 create=True mode=0644 path=/tmp/ansible.ri1c_8s4 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:58:16 np0005538960 python3.9[74972]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ri1c_8s4' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:58:17 np0005538960 python3.9[75126]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ri1c_8s4 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:58:17 np0005538960 systemd[1]: session-17.scope: Deactivated successfully.
Nov 28 10:58:17 np0005538960 systemd[1]: session-17.scope: Consumed 4.120s CPU time.
Nov 28 10:58:17 np0005538960 systemd-logind[788]: Session 17 logged out. Waiting for processes to exit.
Nov 28 10:58:17 np0005538960 systemd-logind[788]: Removed session 17.
Nov 28 10:58:23 np0005538960 systemd-logind[788]: New session 18 of user zuul.
Nov 28 10:58:23 np0005538960 systemd[1]: Started Session 18 of User zuul.
Nov 28 10:58:24 np0005538960 python3.9[75304]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:58:25 np0005538960 python3.9[75460]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 10:58:26 np0005538960 python3.9[75614]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 10:58:27 np0005538960 python3.9[75767]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:58:28 np0005538960 python3.9[75920]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:58:29 np0005538960 python3.9[76074]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:58:30 np0005538960 python3.9[76229]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:58:30 np0005538960 systemd[1]: session-18.scope: Deactivated successfully.
Nov 28 10:58:30 np0005538960 systemd[1]: session-18.scope: Consumed 5.356s CPU time.
Nov 28 10:58:30 np0005538960 systemd-logind[788]: Session 18 logged out. Waiting for processes to exit.
Nov 28 10:58:30 np0005538960 systemd-logind[788]: Removed session 18.
Nov 28 10:58:36 np0005538960 systemd-logind[788]: New session 19 of user zuul.
Nov 28 10:58:36 np0005538960 systemd[1]: Started Session 19 of User zuul.
Nov 28 10:58:37 np0005538960 python3.9[76407]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:58:38 np0005538960 python3.9[76563]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:58:39 np0005538960 python3.9[76647]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Nov 28 10:58:41 np0005538960 python3.9[76798]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 10:58:43 np0005538960 python3.9[76949]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 10:58:44 np0005538960 python3.9[77099]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:58:44 np0005538960 python3.9[77249]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 10:58:45 np0005538960 systemd[1]: session-19.scope: Deactivated successfully.
Nov 28 10:58:45 np0005538960 systemd[1]: session-19.scope: Consumed 6.899s CPU time.
Nov 28 10:58:45 np0005538960 systemd-logind[788]: Session 19 logged out. Waiting for processes to exit.
Nov 28 10:58:45 np0005538960 systemd-logind[788]: Removed session 19.
Nov 28 10:58:51 np0005538960 systemd-logind[788]: New session 20 of user zuul.
Nov 28 10:58:51 np0005538960 systemd[1]: Started Session 20 of User zuul.
Nov 28 10:58:52 np0005538960 python3.9[77427]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:58:54 np0005538960 python3.9[77583]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:58:55 np0005538960 python3.9[77735]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:58:55 np0005538960 python3.9[77887]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:58:56 np0005538960 python3.9[78010]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345535.427215-160-11561738264441/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=191df1c8b1ddd9cd09a18221c5355a8bbf5b9b46 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:58:57 np0005538960 python3.9[78162]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:58:58 np0005538960 python3.9[78285]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345536.8614657-160-122443500308421/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=0430b386f5bbf11f37e60c2d93794313e3eacc0b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:58:58 np0005538960 python3.9[78437]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:58:59 np0005538960 python3.9[78560]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345538.2134037-160-56976748101349/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=886c97f139a03b14402d56d82493d2b55a44ef93 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:00 np0005538960 python3.9[78712]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:00 np0005538960 python3.9[78864]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:01 np0005538960 python3.9[79016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:02 np0005538960 python3.9[79139]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345541.1919394-335-117256803631927/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=b73829146669ccc4b1192d907623431b4e1598d9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:03 np0005538960 python3.9[79291]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:03 np0005538960 python3.9[79414]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345542.5977175-335-107432240293646/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=8295b1b5d6e46bb8d48440a857f83ad0c0b80e25 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:04 np0005538960 chronyd[64989]: Selected source 206.108.0.131 (pool.ntp.org)
Nov 28 10:59:04 np0005538960 python3.9[79566]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:05 np0005538960 python3.9[79689]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345544.0633786-335-271192924185717/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=becc72404ac841bdb7eb15fb58fc2ff967195b1a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:06 np0005538960 python3.9[79841]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:06 np0005538960 python3.9[79993]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:07 np0005538960 python3.9[80145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:08 np0005538960 python3.9[80268]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345547.1623042-521-130182594207515/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=6d92453f8607493762e1563f4269db1026917f9e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:09 np0005538960 python3.9[80420]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:09 np0005538960 python3.9[80543]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345548.5246384-521-15475570988135/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=ceca04d634ba34e51e2dba2ad2ce7ad47c1ce5f1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:10 np0005538960 python3.9[80695]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:11 np0005538960 python3.9[80818]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345549.896274-521-12594679958897/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=d5d6d2b6a10a9f4e5a0f3c326aaaecd7d514116a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:11 np0005538960 python3.9[80970]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:12 np0005538960 python3.9[81122]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:13 np0005538960 python3.9[81274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:14 np0005538960 python3.9[81397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345552.881536-695-230400756714692/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=a8308b0a694375dcd17aa342214dfb4aaa1c79f7 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:14 np0005538960 python3.9[81549]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:15 np0005538960 python3.9[81672]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345554.3686447-695-10211954400949/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=ceca04d634ba34e51e2dba2ad2ce7ad47c1ce5f1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:16 np0005538960 python3.9[81824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:17 np0005538960 python3.9[81947]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345555.8238158-695-100986912425052/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=47c21d3798009611673aca073e9f0cf2fdc896c1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:18 np0005538960 python3.9[82099]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:19 np0005538960 python3.9[82251]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:19 np0005538960 python3.9[82374]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345558.7441034-910-80030118993452/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eedf92a785b2ae02f8a70da1ae583bf08644d85e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:20 np0005538960 python3.9[82526]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:22 np0005538960 python3.9[82679]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:23 np0005538960 python3.9[82802]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345561.1952949-984-34974962464490/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eedf92a785b2ae02f8a70da1ae583bf08644d85e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:24 np0005538960 python3.9[82954]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:25 np0005538960 python3.9[83106]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:25 np0005538960 python3.9[83229]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345564.5240026-1073-186651701456024/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eedf92a785b2ae02f8a70da1ae583bf08644d85e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:26 np0005538960 python3.9[83381]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:27 np0005538960 python3.9[83533]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:28 np0005538960 python3.9[83656]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345567.0780203-1149-146219217929065/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eedf92a785b2ae02f8a70da1ae583bf08644d85e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:29 np0005538960 python3.9[83808]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:30 np0005538960 python3.9[83960]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:30 np0005538960 python3.9[84083]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345569.4832497-1223-161399013822271/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eedf92a785b2ae02f8a70da1ae583bf08644d85e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:31 np0005538960 python3.9[84235]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:32 np0005538960 python3.9[84387]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:33 np0005538960 python3.9[84510]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345571.9192903-1296-212192601001623/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eedf92a785b2ae02f8a70da1ae583bf08644d85e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:33 np0005538960 python3.9[84662]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:34 np0005538960 python3.9[84814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:35 np0005538960 python3.9[84937]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345574.202034-1344-226866912780120/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=eedf92a785b2ae02f8a70da1ae583bf08644d85e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:35 np0005538960 systemd[1]: session-20.scope: Deactivated successfully.
Nov 28 10:59:35 np0005538960 systemd-logind[788]: Session 20 logged out. Waiting for processes to exit.
Nov 28 10:59:35 np0005538960 systemd[1]: session-20.scope: Consumed 34.826s CPU time.
Nov 28 10:59:35 np0005538960 systemd-logind[788]: Removed session 20.
Nov 28 10:59:40 np0005538960 systemd-logind[788]: New session 21 of user zuul.
Nov 28 10:59:40 np0005538960 systemd[1]: Started Session 21 of User zuul.
Nov 28 10:59:42 np0005538960 python3.9[85115]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:59:43 np0005538960 python3.9[85271]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:44 np0005538960 python3.9[85423]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 10:59:45 np0005538960 python3.9[85573]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 10:59:46 np0005538960 python3.9[85725]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 10:59:48 np0005538960 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Nov 28 10:59:48 np0005538960 python3.9[85881]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 10:59:49 np0005538960 python3.9[85965]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 10:59:52 np0005538960 python3.9[86118]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 10:59:53 np0005538960 python3[86273]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Nov 28 10:59:54 np0005538960 python3.9[86425]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:55 np0005538960 python3.9[86577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:56 np0005538960 python3.9[86655]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:57 np0005538960 python3.9[86809]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:57 np0005538960 python3.9[86887]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.jil6s4hb recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 10:59:58 np0005538960 python3.9[87039]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 10:59:59 np0005538960 python3.9[87117]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:00 np0005538960 python3.9[87269]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:00:02 np0005538960 python3[87422]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 11:00:02 np0005538960 python3.9[87574]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:04 np0005538960 python3.9[87699]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345602.435687-432-60365935734625/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:05 np0005538960 python3.9[87851]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:06 np0005538960 python3.9[87976]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345604.5948002-477-231502687380747/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:06 np0005538960 python3.9[88128]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:07 np0005538960 python3.9[88253]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345606.3733928-522-174567211259448/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:08 np0005538960 python3.9[88405]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:09 np0005538960 python3.9[88530]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345608.03351-567-164627714780290/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:10 np0005538960 python3.9[88682]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:11 np0005538960 python3.9[88807]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345610.0476627-612-259129372317651/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:13 np0005538960 python3.9[88959]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:14 np0005538960 python3.9[89111]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:00:15 np0005538960 python3.9[89266]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:16 np0005538960 python3.9[89418]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:00:17 np0005538960 python3.9[89571]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:00:18 np0005538960 python3.9[89725]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:00:18 np0005538960 python3.9[89880]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:20 np0005538960 python3.9[90030]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 11:00:21 np0005538960 python3.9[90183]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:c6:22:5a:f7" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:00:22 np0005538960 ovs-vsctl[90184]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:c6:22:5a:f7 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Nov 28 11:00:23 np0005538960 python3.9[90336]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:00:24 np0005538960 python3.9[90491]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:00:24 np0005538960 ovs-vsctl[90492]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Nov 28 11:00:25 np0005538960 python3.9[90642]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:00:26 np0005538960 python3.9[90796]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:00:27 np0005538960 python3.9[90948]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:28 np0005538960 python3.9[91026]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:00:28 np0005538960 python3.9[91178]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:29 np0005538960 python3.9[91256]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:00:30 np0005538960 python3.9[91408]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:30 np0005538960 python3.9[91560]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:31 np0005538960 python3.9[91638]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:32 np0005538960 python3.9[91790]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:32 np0005538960 python3.9[91868]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:33 np0005538960 python3.9[92020]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:00:33 np0005538960 systemd[1]: Reloading.
Nov 28 11:00:33 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:00:33 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:00:34 np0005538960 python3.9[92210]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:35 np0005538960 python3.9[92288]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:36 np0005538960 python3.9[92440]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:36 np0005538960 python3.9[92518]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:37 np0005538960 python3.9[92670]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:00:37 np0005538960 systemd[1]: Reloading.
Nov 28 11:00:37 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:00:37 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:00:37 np0005538960 systemd[1]: Starting Create netns directory...
Nov 28 11:00:38 np0005538960 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 11:00:38 np0005538960 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 11:00:38 np0005538960 systemd[1]: Finished Create netns directory.
Nov 28 11:00:38 np0005538960 python3.9[92863]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:00:39 np0005538960 python3.9[93015]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:40 np0005538960 python3.9[93138]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345639.1789572-1365-258524196428863/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:00:41 np0005538960 python3.9[93290]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:00:42 np0005538960 python3.9[93442]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:00:42 np0005538960 python3.9[93565]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345641.759163-1440-127790696907443/.source.json _original_basename=.nxtudfn1 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:43 np0005538960 python3.9[93717]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:46 np0005538960 python3.9[94144]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Nov 28 11:00:47 np0005538960 python3.9[94296]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 11:00:50 np0005538960 python3.9[94448]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 11:00:50 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 11:00:52 np0005538960 python3[94612]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 11:00:52 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 11:00:52 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 11:00:52 np0005538960 podman[94648]: 2025-11-28 16:00:52.406975168 +0000 UTC m=+0.075650287 container create 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Nov 28 11:00:52 np0005538960 podman[94648]: 2025-11-28 16:00:52.36977653 +0000 UTC m=+0.038451699 image pull 52cb1910f3f090372807028d1c2aea98d2557b1086636469529f290368ecdf69 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 11:00:52 np0005538960 python3[94612]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Nov 28 11:00:53 np0005538960 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Nov 28 11:00:53 np0005538960 python3.9[94835]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:00:54 np0005538960 python3.9[94989]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:54 np0005538960 python3.9[95065]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:00:55 np0005538960 python3.9[95216]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764345654.931446-1704-239912447202106/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:00:56 np0005538960 python3.9[95292]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:00:56 np0005538960 systemd[1]: Reloading.
Nov 28 11:00:56 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:00:56 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:00:57 np0005538960 python3.9[95404]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:00:57 np0005538960 systemd[1]: Reloading.
Nov 28 11:00:57 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:00:57 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:00:57 np0005538960 systemd[1]: Starting ovn_controller container...
Nov 28 11:00:57 np0005538960 systemd[1]: Created slice Virtual Machine and Container Slice.
Nov 28 11:00:57 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:00:57 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6c71210edbc01738d747450925547259efd11b4323ab814fd3054ece27700bf/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 11:00:57 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707.
Nov 28 11:00:57 np0005538960 podman[95445]: 2025-11-28 16:00:57.732821622 +0000 UTC m=+0.173761221 container init 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:00:57 np0005538960 ovn_controller[95460]: + sudo -E kolla_set_configs
Nov 28 11:00:57 np0005538960 podman[95445]: 2025-11-28 16:00:57.76225293 +0000 UTC m=+0.203192439 container start 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:00:57 np0005538960 edpm-start-podman-container[95445]: ovn_controller
Nov 28 11:00:57 np0005538960 systemd[1]: Created slice User Slice of UID 0.
Nov 28 11:00:57 np0005538960 systemd[1]: Starting User Runtime Directory /run/user/0...
Nov 28 11:00:57 np0005538960 systemd[1]: Finished User Runtime Directory /run/user/0.
Nov 28 11:00:57 np0005538960 systemd[1]: Starting User Manager for UID 0...
Nov 28 11:00:57 np0005538960 edpm-start-podman-container[95444]: Creating additional drop-in dependency for "ovn_controller" (36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707)
Nov 28 11:00:57 np0005538960 systemd[1]: Reloading.
Nov 28 11:00:57 np0005538960 podman[95467]: 2025-11-28 16:00:57.871353982 +0000 UTC m=+0.095975833 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 11:00:57 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:00:57 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:00:57 np0005538960 systemd[95498]: Queued start job for default target Main User Target.
Nov 28 11:00:57 np0005538960 systemd[95498]: Created slice User Application Slice.
Nov 28 11:00:57 np0005538960 systemd[95498]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Nov 28 11:00:57 np0005538960 systemd[95498]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 11:00:57 np0005538960 systemd[95498]: Reached target Paths.
Nov 28 11:00:57 np0005538960 systemd[95498]: Reached target Timers.
Nov 28 11:00:57 np0005538960 systemd[95498]: Starting D-Bus User Message Bus Socket...
Nov 28 11:00:57 np0005538960 systemd[95498]: Starting Create User's Volatile Files and Directories...
Nov 28 11:00:57 np0005538960 systemd[95498]: Listening on D-Bus User Message Bus Socket.
Nov 28 11:00:57 np0005538960 systemd[95498]: Finished Create User's Volatile Files and Directories.
Nov 28 11:00:57 np0005538960 systemd[95498]: Reached target Sockets.
Nov 28 11:00:57 np0005538960 systemd[95498]: Reached target Basic System.
Nov 28 11:00:57 np0005538960 systemd[95498]: Reached target Main User Target.
Nov 28 11:00:57 np0005538960 systemd[95498]: Startup finished in 137ms.
Nov 28 11:00:58 np0005538960 systemd[1]: Started User Manager for UID 0.
Nov 28 11:00:58 np0005538960 systemd[1]: Started ovn_controller container.
Nov 28 11:00:58 np0005538960 systemd[1]: 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707-57e420bb4a23d8d3.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 11:00:58 np0005538960 systemd[1]: 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707-57e420bb4a23d8d3.service: Failed with result 'exit-code'.
Nov 28 11:00:58 np0005538960 systemd[1]: Started Session c1 of User root.
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: INFO:__main__:Validating config file
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: INFO:__main__:Writing out command to execute
Nov 28 11:00:58 np0005538960 systemd[1]: session-c1.scope: Deactivated successfully.
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: ++ cat /run_command
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: + ARGS=
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: + sudo kolla_copy_cacerts
Nov 28 11:00:58 np0005538960 systemd[1]: Started Session c2 of User root.
Nov 28 11:00:58 np0005538960 systemd[1]: session-c2.scope: Deactivated successfully.
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: + [[ ! -n '' ]]
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: + . kolla_extend_start
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: + umask 0022
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Nov 28 11:00:58 np0005538960 NetworkManager[55548]: <info>  [1764345658.2842] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Nov 28 11:00:58 np0005538960 NetworkManager[55548]: <info>  [1764345658.2853] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 11:00:58 np0005538960 NetworkManager[55548]: <info>  [1764345658.2865] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Nov 28 11:00:58 np0005538960 NetworkManager[55548]: <info>  [1764345658.2873] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Nov 28 11:00:58 np0005538960 NetworkManager[55548]: <info>  [1764345658.2878] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 28 11:00:58 np0005538960 kernel: br-int: entered promiscuous mode
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00014|main|INFO|OVS feature set changed, force recompute.
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00022|main|INFO|OVS feature set changed, force recompute.
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Nov 28 11:00:58 np0005538960 NetworkManager[55548]: <info>  [1764345658.3091] manager: (ovn-1218ff-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Nov 28 11:00:58 np0005538960 kernel: genev_sys_6081: entered promiscuous mode
Nov 28 11:00:58 np0005538960 NetworkManager[55548]: <info>  [1764345658.3300] device (genev_sys_6081): carrier: link connected
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 11:00:58 np0005538960 NetworkManager[55548]: <info>  [1764345658.3303] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Nov 28 11:00:58 np0005538960 systemd-udevd[95600]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:00:58 np0005538960 ovn_controller[95460]: 2025-11-28T16:00:58Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Nov 28 11:00:58 np0005538960 systemd-udevd[95604]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:00:58 np0005538960 NetworkManager[55548]: <info>  [1764345658.4205] manager: (ovn-fc8108-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Nov 28 11:00:58 np0005538960 NetworkManager[55548]: <info>  [1764345658.5069] manager: (ovn-53e209-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Nov 28 11:00:59 np0005538960 python3.9[95734]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:00:59 np0005538960 ovs-vsctl[95735]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Nov 28 11:01:00 np0005538960 python3.9[95887]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:01:00 np0005538960 ovs-vsctl[95889]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Nov 28 11:01:01 np0005538960 python3.9[96042]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:01:01 np0005538960 ovs-vsctl[96043]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Nov 28 11:01:01 np0005538960 systemd[1]: session-21.scope: Deactivated successfully.
Nov 28 11:01:01 np0005538960 systemd[1]: session-21.scope: Consumed 52.491s CPU time.
Nov 28 11:01:01 np0005538960 systemd-logind[788]: Session 21 logged out. Waiting for processes to exit.
Nov 28 11:01:01 np0005538960 systemd-logind[788]: Removed session 21.
Nov 28 11:01:08 np0005538960 systemd[1]: Stopping User Manager for UID 0...
Nov 28 11:01:08 np0005538960 systemd[95498]: Activating special unit Exit the Session...
Nov 28 11:01:08 np0005538960 systemd[95498]: Stopped target Main User Target.
Nov 28 11:01:08 np0005538960 systemd[95498]: Stopped target Basic System.
Nov 28 11:01:08 np0005538960 systemd[95498]: Stopped target Paths.
Nov 28 11:01:08 np0005538960 systemd[95498]: Stopped target Sockets.
Nov 28 11:01:08 np0005538960 systemd[95498]: Stopped target Timers.
Nov 28 11:01:08 np0005538960 systemd[95498]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 11:01:08 np0005538960 systemd[95498]: Closed D-Bus User Message Bus Socket.
Nov 28 11:01:08 np0005538960 systemd[95498]: Stopped Create User's Volatile Files and Directories.
Nov 28 11:01:08 np0005538960 systemd[95498]: Removed slice User Application Slice.
Nov 28 11:01:08 np0005538960 systemd[95498]: Reached target Shutdown.
Nov 28 11:01:08 np0005538960 systemd[95498]: Finished Exit the Session.
Nov 28 11:01:08 np0005538960 systemd[95498]: Reached target Exit the Session.
Nov 28 11:01:08 np0005538960 systemd[1]: user@0.service: Deactivated successfully.
Nov 28 11:01:08 np0005538960 systemd[1]: Stopped User Manager for UID 0.
Nov 28 11:01:08 np0005538960 systemd[1]: Stopping User Runtime Directory /run/user/0...
Nov 28 11:01:08 np0005538960 systemd[1]: run-user-0.mount: Deactivated successfully.
Nov 28 11:01:08 np0005538960 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Nov 28 11:01:08 np0005538960 systemd[1]: Stopped User Runtime Directory /run/user/0.
Nov 28 11:01:08 np0005538960 systemd[1]: Removed slice User Slice of UID 0.
Nov 28 11:01:08 np0005538960 systemd-logind[788]: New session 23 of user zuul.
Nov 28 11:01:08 np0005538960 systemd[1]: Started Session 23 of User zuul.
Nov 28 11:01:09 np0005538960 python3.9[96238]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 11:01:11 np0005538960 python3.9[96394]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:12 np0005538960 python3.9[96546]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:13 np0005538960 python3.9[96698]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:14 np0005538960 python3.9[96850]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:15 np0005538960 python3.9[97002]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:15 np0005538960 python3.9[97152]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 11:01:16 np0005538960 python3.9[97305]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Nov 28 11:01:18 np0005538960 python3.9[97455]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:19 np0005538960 python3.9[97576]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345677.7859874-219-164744896102506/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:20 np0005538960 python3.9[97726]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:20 np0005538960 python3.9[97847]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345679.4607642-264-88457461132124/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:21 np0005538960 python3.9[97999]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 11:01:22 np0005538960 python3.9[98083]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 11:01:25 np0005538960 python3.9[98236]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 11:01:26 np0005538960 python3.9[98389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:26 np0005538960 python3.9[98510]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345685.630188-375-100220457655743/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:27 np0005538960 python3.9[98660]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:28 np0005538960 ovn_controller[95460]: 2025-11-28T16:01:28Z|00025|memory|INFO|16000 kB peak resident set size after 30.0 seconds
Nov 28 11:01:28 np0005538960 ovn_controller[95460]: 2025-11-28T16:01:28Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Nov 28 11:01:28 np0005538960 podman[98755]: 2025-11-28 16:01:28.314630875 +0000 UTC m=+0.140524280 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 11:01:28 np0005538960 python3.9[98792]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345687.1321983-375-99529280044672/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:30 np0005538960 python3.9[98957]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:30 np0005538960 python3.9[99078]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345689.4432113-507-266694722136805/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:31 np0005538960 python3.9[99228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:32 np0005538960 python3.9[99349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345691.0326083-507-57443088725219/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:32 np0005538960 python3.9[99499]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:01:34 np0005538960 python3.9[99653]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:34 np0005538960 python3.9[99805]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:35 np0005538960 python3.9[99883]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:36 np0005538960 python3.9[100035]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:36 np0005538960 python3.9[100113]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:37 np0005538960 python3.9[100265]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:01:38 np0005538960 python3.9[100417]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:38 np0005538960 python3.9[100495]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:01:39 np0005538960 python3.9[100647]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:40 np0005538960 python3.9[100725]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:01:41 np0005538960 python3.9[100877]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:01:41 np0005538960 systemd[1]: Reloading.
Nov 28 11:01:41 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:01:41 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:01:42 np0005538960 python3.9[101066]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:43 np0005538960 python3.9[101144]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:01:43 np0005538960 python3.9[101296]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:44 np0005538960 python3.9[101374]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:01:45 np0005538960 python3.9[101526]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:01:45 np0005538960 systemd[1]: Reloading.
Nov 28 11:01:45 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:01:45 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:01:45 np0005538960 systemd[1]: Starting Create netns directory...
Nov 28 11:01:45 np0005538960 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 11:01:45 np0005538960 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 11:01:45 np0005538960 systemd[1]: Finished Create netns directory.
Nov 28 11:01:46 np0005538960 python3.9[101720]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:47 np0005538960 python3.9[101872]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:48 np0005538960 python3.9[101995]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764345707.0594923-961-112644606702651/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:49 np0005538960 python3.9[102147]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:01:50 np0005538960 python3.9[102299]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:01:51 np0005538960 python3.9[102422]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764345709.6721666-1035-26951070481488/.source.json _original_basename=.d2dvwjnm follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:01:52 np0005538960 python3.9[102574]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:01:54 np0005538960 python3.9[103001]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Nov 28 11:01:55 np0005538960 python3.9[103153]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 11:01:56 np0005538960 python3.9[103305]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 11:01:58 np0005538960 python3[103483]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 11:01:58 np0005538960 podman[103522]: 2025-11-28 16:01:58.853729922 +0000 UTC m=+0.088107409 container create a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:01:58 np0005538960 podman[103522]: 2025-11-28 16:01:58.810644638 +0000 UTC m=+0.045022115 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:01:58 np0005538960 python3[103483]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:01:59 np0005538960 podman[103562]: 2025-11-28 16:01:59.263516199 +0000 UTC m=+0.151242527 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 11:01:59 np0005538960 python3.9[103741]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:02:01 np0005538960 python3.9[103895]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:01 np0005538960 python3.9[103971]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:02:02 np0005538960 python3.9[104122]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764345721.7430286-1299-238976227978459/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:02 np0005538960 python3.9[104198]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:02:02 np0005538960 systemd[1]: Reloading.
Nov 28 11:02:03 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:02:03 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:02:03 np0005538960 python3.9[104309]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:02:03 np0005538960 systemd[1]: Reloading.
Nov 28 11:02:03 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:02:03 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:02:04 np0005538960 systemd[1]: Starting ovn_metadata_agent container...
Nov 28 11:02:04 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:02:04 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75f898d42055d9d2d5822e990531968d87621b93f0b2cd4ad533acbdcfd855e3/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Nov 28 11:02:04 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/75f898d42055d9d2d5822e990531968d87621b93f0b2cd4ad533acbdcfd855e3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:02:04 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4.
Nov 28 11:02:04 np0005538960 podman[104349]: 2025-11-28 16:02:04.344684786 +0000 UTC m=+0.190328165 container init a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: + sudo -E kolla_set_configs
Nov 28 11:02:04 np0005538960 podman[104349]: 2025-11-28 16:02:04.368535182 +0000 UTC m=+0.214178561 container start a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 11:02:04 np0005538960 edpm-start-podman-container[104349]: ovn_metadata_agent
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Validating config file
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Copying service configuration files
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Writing out command to execute
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Setting permission for /var/lib/neutron
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Setting permission for /var/lib/neutron/external
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Nov 28 11:02:04 np0005538960 edpm-start-podman-container[104348]: Creating additional drop-in dependency for "ovn_metadata_agent" (a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4)
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: ++ cat /run_command
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: + CMD=neutron-ovn-metadata-agent
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: + ARGS=
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: + sudo kolla_copy_cacerts
Nov 28 11:02:04 np0005538960 systemd[1]: Reloading.
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: + [[ ! -n '' ]]
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: + . kolla_extend_start
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: Running command: 'neutron-ovn-metadata-agent'
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: + umask 0022
Nov 28 11:02:04 np0005538960 ovn_metadata_agent[104364]: + exec neutron-ovn-metadata-agent
Nov 28 11:02:04 np0005538960 podman[104371]: 2025-11-28 16:02:04.469850919 +0000 UTC m=+0.087287011 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Nov 28 11:02:04 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:02:04 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:02:04 np0005538960 systemd[1]: Started ovn_metadata_agent container.
Nov 28 11:02:05 np0005538960 systemd-logind[788]: Session 23 logged out. Waiting for processes to exit.
Nov 28 11:02:05 np0005538960 systemd[1]: session-23.scope: Deactivated successfully.
Nov 28 11:02:05 np0005538960 systemd[1]: session-23.scope: Consumed 40.959s CPU time.
Nov 28 11:02:05 np0005538960 systemd-logind[788]: Removed session 23.
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.286 104369 INFO neutron.common.config [-] Logging enabled!#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.287 104369 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.287 104369 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.287 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.287 104369 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.287 104369 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.288 104369 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.288 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.288 104369 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.288 104369 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.288 104369 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.288 104369 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.288 104369 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.288 104369 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.288 104369 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.289 104369 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.290 104369 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.290 104369 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.290 104369 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.290 104369 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.290 104369 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.290 104369 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.290 104369 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.290 104369 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.290 104369 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.291 104369 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.291 104369 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.291 104369 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.291 104369 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.291 104369 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.291 104369 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.291 104369 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.291 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.291 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.291 104369 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.292 104369 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.293 104369 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.293 104369 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.293 104369 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.293 104369 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.293 104369 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.293 104369 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.293 104369 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.293 104369 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.293 104369 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.294 104369 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.294 104369 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.294 104369 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.294 104369 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.294 104369 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.294 104369 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.294 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.294 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.294 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.294 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.295 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.295 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.295 104369 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.295 104369 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.295 104369 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.295 104369 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.295 104369 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.295 104369 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.295 104369 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.295 104369 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.296 104369 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.296 104369 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.296 104369 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.296 104369 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.296 104369 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.296 104369 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.296 104369 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.296 104369 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.296 104369 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.296 104369 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.297 104369 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.298 104369 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.298 104369 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.298 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.298 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.298 104369 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.298 104369 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.298 104369 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.298 104369 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.298 104369 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.298 104369 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.299 104369 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.299 104369 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.299 104369 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.299 104369 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.299 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.299 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.299 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.299 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.299 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.300 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.300 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.300 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.300 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.300 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.300 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.300 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.300 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.300 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.300 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.301 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.301 104369 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.301 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.301 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.301 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.301 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.301 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.301 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.301 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.302 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.302 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.302 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.302 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.302 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.302 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.302 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.302 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.302 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.303 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.303 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.303 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.303 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.303 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.303 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.303 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.303 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.303 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.303 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.304 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.304 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.304 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.304 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.304 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.304 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.304 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.304 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.304 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.305 104369 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.305 104369 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.305 104369 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.305 104369 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.305 104369 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.305 104369 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.305 104369 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.305 104369 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.305 104369 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.305 104369 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.306 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.306 104369 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.306 104369 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.306 104369 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.306 104369 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.306 104369 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.306 104369 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.306 104369 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.306 104369 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.307 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.307 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.307 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.307 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.307 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.307 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.307 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.307 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.307 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.307 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.308 104369 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.308 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.308 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.308 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.308 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.308 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.308 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.308 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.308 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.309 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.309 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.309 104369 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.309 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.309 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.309 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.309 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.309 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.309 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.309 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.310 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.310 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.310 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.310 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.310 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.310 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.310 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.310 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.310 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.310 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.311 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.311 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.311 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.311 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.311 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.311 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.311 104369 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.311 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.311 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.311 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.312 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.312 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.312 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.312 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.312 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.312 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.312 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.312 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.312 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.313 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.313 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.313 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.313 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.313 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.313 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.313 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.313 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.313 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.313 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.314 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.314 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.314 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.314 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.314 104369 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.314 104369 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.314 104369 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.314 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.315 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.315 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.315 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.315 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.315 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.315 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.315 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.315 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.316 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.316 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.316 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.316 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.316 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.316 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.316 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.316 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.316 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.317 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.317 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.317 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.317 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.317 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.317 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.317 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.317 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.317 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.317 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.318 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.318 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.318 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.318 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.318 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.318 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.318 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.318 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.318 104369 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.318 104369 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.370 104369 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.370 104369 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.370 104369 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.370 104369 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.371 104369 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.383 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name ac0d1e81-02b2-487b-bc65-46ccb331e9e4 (UUID: ac0d1e81-02b2-487b-bc65-46ccb331e9e4) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.422 104369 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.422 104369 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.422 104369 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.422 104369 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.434 104369 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.442 104369 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.449 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'ac0d1e81-02b2-487b-bc65-46ccb331e9e4'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], external_ids={}, name=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, nb_cfg_timestamp=1764345666309, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.451 104369 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f087f1f7a90>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.452 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.452 104369 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.452 104369 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.453 104369 INFO oslo_service.service [-] Starting 1 workers#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.459 104369 DEBUG oslo_service.service [-] Started child 104477 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.465 104369 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpu67eyl_6/privsep.sock']#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.466 104477 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-365185'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.498 104477 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.498 104477 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.499 104477 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.502 104477 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.510 104477 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Nov 28 11:02:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:06.517 104477 INFO eventlet.wsgi.server [-] (104477) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Nov 28 11:02:07 np0005538960 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Nov 28 11:02:07 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:07.180 104369 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 28 11:02:07 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:07.181 104369 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpu67eyl_6/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 28 11:02:07 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:07.041 104482 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 11:02:07 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:07.046 104482 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 11:02:07 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:07.049 104482 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Nov 28 11:02:07 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:07.049 104482 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104482
Nov 28 11:02:07 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:07.185 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[65acc29a-33ba-4f5b-ad2f-8e5cd7dccbc4]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:02:07 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:07.677 104482 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:02:07 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:07.677 104482 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:02:07 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:07.677 104482 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.215 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[5cd74c50-76e8-4ecd-a209-5c0a06dcee12]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.220 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, column=external_ids, values=({'neutron:ovn-metadata-id': 'a57c4959-29a8-5644-ae98-911c97f2e16f'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.232 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.245 104369 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.246 104369 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.246 104369 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.246 104369 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.246 104369 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.246 104369 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.247 104369 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.247 104369 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.247 104369 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.248 104369 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.248 104369 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.248 104369 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.248 104369 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.249 104369 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.249 104369 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.249 104369 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.250 104369 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.250 104369 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.250 104369 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.250 104369 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.251 104369 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.251 104369 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.251 104369 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.251 104369 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.252 104369 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.252 104369 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.253 104369 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.253 104369 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.253 104369 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.254 104369 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.254 104369 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.254 104369 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.254 104369 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.255 104369 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.255 104369 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.255 104369 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.256 104369 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.256 104369 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.256 104369 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.257 104369 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.257 104369 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.257 104369 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.257 104369 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.258 104369 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.258 104369 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.258 104369 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.258 104369 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.258 104369 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.259 104369 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.259 104369 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.259 104369 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.260 104369 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.260 104369 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.260 104369 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.260 104369 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.261 104369 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.261 104369 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.261 104369 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.261 104369 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.262 104369 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.262 104369 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.262 104369 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.262 104369 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.263 104369 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.263 104369 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.263 104369 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.263 104369 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.264 104369 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.264 104369 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.264 104369 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.264 104369 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.265 104369 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.265 104369 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.265 104369 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.265 104369 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.266 104369 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.266 104369 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.266 104369 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.266 104369 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.267 104369 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.267 104369 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.267 104369 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.268 104369 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.268 104369 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.268 104369 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.269 104369 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.269 104369 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.269 104369 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.270 104369 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.270 104369 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.270 104369 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.270 104369 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.270 104369 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.271 104369 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.271 104369 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.271 104369 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.271 104369 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.272 104369 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.272 104369 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.272 104369 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.272 104369 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.273 104369 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.273 104369 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.273 104369 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.273 104369 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.274 104369 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.274 104369 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.274 104369 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.275 104369 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.275 104369 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.275 104369 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.276 104369 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.276 104369 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.276 104369 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.277 104369 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.277 104369 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.277 104369 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.277 104369 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.278 104369 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.278 104369 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.278 104369 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.279 104369 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.279 104369 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.279 104369 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.280 104369 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.280 104369 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.280 104369 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.280 104369 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.281 104369 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.281 104369 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.281 104369 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.281 104369 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.282 104369 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.282 104369 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.282 104369 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.283 104369 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.283 104369 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.283 104369 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.284 104369 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.284 104369 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.284 104369 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.285 104369 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.285 104369 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.285 104369 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.286 104369 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.286 104369 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.286 104369 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.286 104369 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.287 104369 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.287 104369 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.287 104369 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.288 104369 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.288 104369 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.288 104369 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.289 104369 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.289 104369 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.289 104369 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.289 104369 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.289 104369 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.289 104369 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.289 104369 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.290 104369 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.290 104369 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.290 104369 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.290 104369 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.290 104369 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.291 104369 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.291 104369 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.291 104369 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.291 104369 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.291 104369 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.291 104369 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.292 104369 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.292 104369 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.292 104369 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.292 104369 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.292 104369 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.292 104369 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.293 104369 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.293 104369 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.293 104369 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.293 104369 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.293 104369 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.293 104369 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.294 104369 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.294 104369 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.294 104369 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.294 104369 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.294 104369 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.294 104369 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.294 104369 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.295 104369 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.295 104369 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.295 104369 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.295 104369 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.295 104369 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.295 104369 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.295 104369 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.296 104369 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.296 104369 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.296 104369 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.296 104369 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.296 104369 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.296 104369 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.296 104369 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.297 104369 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.297 104369 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.297 104369 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.297 104369 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.297 104369 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.297 104369 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.297 104369 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.298 104369 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.298 104369 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.298 104369 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.298 104369 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.298 104369 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.298 104369 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.298 104369 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.299 104369 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.299 104369 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.299 104369 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.299 104369 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.299 104369 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.299 104369 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.299 104369 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.300 104369 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.300 104369 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.300 104369 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.300 104369 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.300 104369 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.300 104369 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.300 104369 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.301 104369 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.301 104369 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.301 104369 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.301 104369 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.301 104369 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.301 104369 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.301 104369 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.302 104369 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.302 104369 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.302 104369 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.302 104369 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.302 104369 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.302 104369 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.303 104369 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.303 104369 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.303 104369 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.303 104369 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.303 104369 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.303 104369 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.303 104369 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.304 104369 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.304 104369 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.304 104369 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.304 104369 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.304 104369 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.304 104369 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.305 104369 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.305 104369 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.305 104369 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.305 104369 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.305 104369 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.305 104369 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.305 104369 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.306 104369 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.306 104369 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.306 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.306 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.306 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.306 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.307 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.307 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.307 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.307 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.307 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.307 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.307 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.308 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.308 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.308 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.308 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.308 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.308 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.309 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.309 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.309 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.309 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.309 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.309 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.309 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.310 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.310 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.310 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.310 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.310 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.310 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.311 104369 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.311 104369 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.311 104369 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.311 104369 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.311 104369 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:02:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:02:08.311 104369 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 28 11:02:11 np0005538960 systemd-logind[788]: New session 24 of user zuul.
Nov 28 11:02:11 np0005538960 systemd[1]: Started Session 24 of User zuul.
Nov 28 11:02:12 np0005538960 python3.9[104640]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 11:02:13 np0005538960 python3.9[104796]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:02:15 np0005538960 python3.9[104961]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:02:15 np0005538960 systemd[1]: Reloading.
Nov 28 11:02:15 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:02:15 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:02:16 np0005538960 python3.9[105145]: ansible-ansible.builtin.service_facts Invoked
Nov 28 11:02:16 np0005538960 network[105162]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 11:02:16 np0005538960 network[105163]: 'network-scripts' will be removed from distribution in near future.
Nov 28 11:02:16 np0005538960 network[105164]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 11:02:23 np0005538960 python3.9[105425]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:02:25 np0005538960 python3.9[105578]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:02:25 np0005538960 python3.9[105731]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:02:26 np0005538960 python3.9[105884]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:02:27 np0005538960 python3.9[106037]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:02:28 np0005538960 python3.9[106190]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:02:29 np0005538960 podman[106343]: 2025-11-28 16:02:29.47400074 +0000 UTC m=+0.158605475 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:02:29 np0005538960 python3.9[106344]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:02:31 np0005538960 python3.9[106524]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:32 np0005538960 python3.9[106676]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:33 np0005538960 python3.9[106828]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:33 np0005538960 python3.9[106980]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:34 np0005538960 python3.9[107132]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:35 np0005538960 podman[107256]: 2025-11-28 16:02:35.053343305 +0000 UTC m=+0.090055403 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:02:35 np0005538960 python3.9[107303]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:35 np0005538960 python3.9[107455]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:36 np0005538960 python3.9[107607]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:37 np0005538960 python3.9[107759]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:38 np0005538960 python3.9[107911]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:38 np0005538960 python3.9[108063]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:39 np0005538960 python3.9[108215]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:40 np0005538960 python3.9[108367]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:41 np0005538960 python3.9[108519]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:02:42 np0005538960 python3.9[108671]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:02:43 np0005538960 python3.9[108823]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 11:02:44 np0005538960 python3.9[108975]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:02:44 np0005538960 systemd[1]: Reloading.
Nov 28 11:02:44 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:02:44 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:02:45 np0005538960 python3.9[109163]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:02:46 np0005538960 python3.9[109316]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:02:47 np0005538960 python3.9[109469]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:02:47 np0005538960 python3.9[109622]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:02:48 np0005538960 python3.9[109775]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:02:49 np0005538960 python3.9[109928]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:02:50 np0005538960 python3.9[110081]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:02:51 np0005538960 python3.9[110234]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Nov 28 11:02:52 np0005538960 python3.9[110387]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 11:02:54 np0005538960 python3.9[110545]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 11:02:55 np0005538960 python3.9[110705]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 11:02:56 np0005538960 python3.9[110789]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 11:03:00 np0005538960 podman[110800]: 2025-11-28 16:03:00.24366406 +0000 UTC m=+0.141253059 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:03:05 np0005538960 podman[110827]: 2025-11-28 16:03:05.181189343 +0000 UTC m=+0.073329470 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 11:03:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:03:06.321 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:03:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:03:06.322 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:03:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:03:06.322 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:03:31 np0005538960 podman[111027]: 2025-11-28 16:03:31.257719469 +0000 UTC m=+0.149576665 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:03:36 np0005538960 podman[111053]: 2025-11-28 16:03:36.157726415 +0000 UTC m=+0.066766126 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:03:38 np0005538960 kernel: SELinux:  Converting 2756 SID table entries...
Nov 28 11:03:38 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 11:03:38 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 11:03:38 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 11:03:38 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 11:03:38 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 11:03:38 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 11:03:38 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 11:03:49 np0005538960 kernel: SELinux:  Converting 2756 SID table entries...
Nov 28 11:03:49 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 11:03:49 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 11:03:49 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 11:03:49 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 11:03:49 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 11:03:49 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 11:03:49 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 11:04:02 np0005538960 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Nov 28 11:04:02 np0005538960 podman[111148]: 2025-11-28 16:04:02.27272611 +0000 UTC m=+0.138472484 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:04:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:04:06.322 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:04:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:04:06.323 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:04:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:04:06.323 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:04:07 np0005538960 podman[113828]: 2025-11-28 16:04:07.154222536 +0000 UTC m=+0.063394795 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Nov 28 11:04:33 np0005538960 podman[126954]: 2025-11-28 16:04:33.204949572 +0000 UTC m=+0.106372914 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:04:38 np0005538960 podman[127950]: 2025-11-28 16:04:38.152458484 +0000 UTC m=+0.055853855 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:04:55 np0005538960 kernel: SELinux:  Converting 2757 SID table entries...
Nov 28 11:04:55 np0005538960 kernel: SELinux:  policy capability network_peer_controls=1
Nov 28 11:04:55 np0005538960 kernel: SELinux:  policy capability open_perms=1
Nov 28 11:04:55 np0005538960 kernel: SELinux:  policy capability extended_socket_class=1
Nov 28 11:04:55 np0005538960 kernel: SELinux:  policy capability always_check_network=0
Nov 28 11:04:55 np0005538960 kernel: SELinux:  policy capability cgroup_seclabel=1
Nov 28 11:04:55 np0005538960 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Nov 28 11:04:55 np0005538960 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Nov 28 11:05:02 np0005538960 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Nov 28 11:05:02 np0005538960 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Nov 28 11:05:02 np0005538960 dbus-broker-launch[755]: Noticed file-system modification, trigger reload.
Nov 28 11:05:03 np0005538960 podman[128015]: 2025-11-28 16:05:03.761999757 +0000 UTC m=+0.114882607 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:05:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:05:06.323 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:05:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:05:06.324 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:05:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:05:06.324 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:05:09 np0005538960 podman[128079]: 2025-11-28 16:05:09.205202223 +0000 UTC m=+0.087046137 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:05:13 np0005538960 systemd[1]: Stopping OpenSSH server daemon...
Nov 28 11:05:13 np0005538960 systemd[1]: sshd.service: Deactivated successfully.
Nov 28 11:05:13 np0005538960 systemd[1]: Stopped OpenSSH server daemon.
Nov 28 11:05:13 np0005538960 systemd[1]: sshd.service: Consumed 2.720s CPU time, read 32.0K from disk, written 20.0K to disk.
Nov 28 11:05:13 np0005538960 systemd[1]: Stopped target sshd-keygen.target.
Nov 28 11:05:13 np0005538960 systemd[1]: Stopping sshd-keygen.target...
Nov 28 11:05:13 np0005538960 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 11:05:13 np0005538960 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 11:05:13 np0005538960 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Nov 28 11:05:13 np0005538960 systemd[1]: Reached target sshd-keygen.target.
Nov 28 11:05:13 np0005538960 systemd[1]: Starting OpenSSH server daemon...
Nov 28 11:05:13 np0005538960 systemd[1]: Started OpenSSH server daemon.
Nov 28 11:05:16 np0005538960 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 11:05:16 np0005538960 systemd[1]: Starting man-db-cache-update.service...
Nov 28 11:05:16 np0005538960 systemd[1]: Reloading.
Nov 28 11:05:16 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:05:16 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:05:16 np0005538960 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 11:05:23 np0005538960 python3.9[135442]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 11:05:23 np0005538960 systemd[1]: Reloading.
Nov 28 11:05:23 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:05:23 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:05:24 np0005538960 python3.9[136720]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 11:05:24 np0005538960 systemd[1]: Reloading.
Nov 28 11:05:24 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:05:24 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:05:25 np0005538960 python3.9[137801]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 11:05:25 np0005538960 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 11:05:25 np0005538960 systemd[1]: Finished man-db-cache-update.service.
Nov 28 11:05:25 np0005538960 systemd[1]: man-db-cache-update.service: Consumed 12.301s CPU time.
Nov 28 11:05:25 np0005538960 systemd[1]: run-r47e19a395e1e4e09b95a83366189bba6.service: Deactivated successfully.
Nov 28 11:05:25 np0005538960 systemd[1]: Reloading.
Nov 28 11:05:25 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:05:26 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:05:26 np0005538960 python3.9[138144]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 11:05:27 np0005538960 systemd[1]: Reloading.
Nov 28 11:05:27 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:05:27 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:05:29 np0005538960 python3.9[138334]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:29 np0005538960 systemd[1]: Reloading.
Nov 28 11:05:30 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:05:30 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:05:31 np0005538960 python3.9[138524]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:31 np0005538960 systemd[1]: Reloading.
Nov 28 11:05:31 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:05:31 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:05:32 np0005538960 python3.9[138715]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:32 np0005538960 systemd[1]: Reloading.
Nov 28 11:05:32 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:05:32 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:05:33 np0005538960 python3.9[138905]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:34 np0005538960 podman[138907]: 2025-11-28 16:05:34.142824661 +0000 UTC m=+0.136039094 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 11:05:34 np0005538960 python3.9[139088]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:35 np0005538960 systemd[1]: Reloading.
Nov 28 11:05:35 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:05:35 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:05:36 np0005538960 python3.9[139278]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Nov 28 11:05:36 np0005538960 systemd[1]: Reloading.
Nov 28 11:05:37 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:05:37 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:05:37 np0005538960 systemd[1]: Listening on libvirt proxy daemon socket.
Nov 28 11:05:37 np0005538960 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Nov 28 11:05:38 np0005538960 python3.9[139472]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:39 np0005538960 python3.9[139629]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:39 np0005538960 podman[139756]: 2025-11-28 16:05:39.757929919 +0000 UTC m=+0.073062076 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:05:40 np0005538960 python3.9[139802]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:40 np0005538960 python3.9[139959]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:42 np0005538960 python3.9[140114]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:43 np0005538960 python3.9[140269]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:44 np0005538960 python3.9[140424]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:45 np0005538960 python3.9[140579]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:46 np0005538960 python3.9[140734]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:47 np0005538960 python3.9[140889]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:48 np0005538960 python3.9[141044]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:49 np0005538960 python3.9[141199]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:50 np0005538960 python3.9[141354]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:51 np0005538960 python3.9[141509]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Nov 28 11:05:52 np0005538960 python3.9[141664]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:05:53 np0005538960 python3.9[141816]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:05:54 np0005538960 python3.9[141968]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:05:55 np0005538960 python3.9[142120]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:05:56 np0005538960 python3.9[142272]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:05:56 np0005538960 python3.9[142424]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:05:57 np0005538960 python3.9[142576]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:05:58 np0005538960 python3.9[142701]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764345957.1489563-1623-6241029473960/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:05:59 np0005538960 python3.9[142853]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:00 np0005538960 python3.9[142978]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764345958.8313444-1623-111676200693818/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:00 np0005538960 python3.9[143130]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:01 np0005538960 python3.9[143255]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764345960.292885-1623-107021615469009/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:02 np0005538960 python3.9[143407]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:03 np0005538960 python3.9[143532]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764345961.963128-1623-257188127085000/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:04 np0005538960 python3.9[143684]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:04 np0005538960 podman[143781]: 2025-11-28 16:06:04.560973152 +0000 UTC m=+0.114971606 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:06:04 np0005538960 python3.9[143828]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764345963.3748322-1623-200927702731632/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:05 np0005538960 python3.9[143987]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:06 np0005538960 python3.9[144112]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764345964.867526-1623-258735665302074/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:06:06.325 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:06:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:06:06.325 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:06:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:06:06.326 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:06:06 np0005538960 python3.9[144264]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:07 np0005538960 python3.9[144387]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764345966.2955022-1623-244064386284125/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:08 np0005538960 python3.9[144539]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:08 np0005538960 python3.9[144664]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1764345967.70557-1623-161738693152014/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:09 np0005538960 python3.9[144818]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Nov 28 11:06:10 np0005538960 podman[144844]: 2025-11-28 16:06:10.149463032 +0000 UTC m=+0.054973389 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 11:06:10 np0005538960 python3.9[144991]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:11 np0005538960 python3.9[145143]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:12 np0005538960 python3.9[145295]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:13 np0005538960 python3.9[145447]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:13 np0005538960 python3.9[145599]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:14 np0005538960 python3.9[145751]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:15 np0005538960 python3.9[145903]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:16 np0005538960 python3.9[146055]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:16 np0005538960 python3.9[146208]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:17 np0005538960 python3.9[146360]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:18 np0005538960 python3.9[146512]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:18 np0005538960 python3.9[146664]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:19 np0005538960 python3.9[146816]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:20 np0005538960 python3.9[146968]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:21 np0005538960 python3.9[147120]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:22 np0005538960 python3.9[147243]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345981.333915-2286-134220228829075/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:23 np0005538960 python3.9[147395]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:23 np0005538960 python3.9[147518]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345982.747502-2286-19218180308234/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:24 np0005538960 python3.9[147670]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:25 np0005538960 python3.9[147793]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345984.1162138-2286-176596833025624/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:26 np0005538960 python3.9[147945]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:27 np0005538960 python3.9[148068]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345986.0549378-2286-267850862554615/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:28 np0005538960 python3.9[148220]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:28 np0005538960 python3.9[148343]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345987.4766865-2286-216259004666338/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:29 np0005538960 python3.9[148495]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:29 np0005538960 python3.9[148618]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345988.8678608-2286-20117900644552/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:30 np0005538960 python3.9[148771]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:31 np0005538960 python3.9[148894]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345990.14347-2286-112348902674631/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:31 np0005538960 python3.9[149046]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:32 np0005538960 python3.9[149169]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345991.4668105-2286-10125121182861/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:33 np0005538960 python3.9[149321]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:34 np0005538960 python3.9[149444]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345992.8560853-2286-91356861311491/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:34 np0005538960 podman[149596]: 2025-11-28 16:06:34.772019271 +0000 UTC m=+0.105893501 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 11:06:34 np0005538960 python3.9[149597]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:35 np0005538960 python3.9[149746]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345994.236201-2286-235444021672698/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:36 np0005538960 python3.9[149898]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:36 np0005538960 python3.9[150021]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345995.725507-2286-92256118439893/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:37 np0005538960 python3.9[150173]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:38 np0005538960 python3.9[150296]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345997.1227064-2286-156495421382669/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:39 np0005538960 python3.9[150448]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:39 np0005538960 python3.9[150571]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345998.526869-2286-123493779745406/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:40 np0005538960 podman[150723]: 2025-11-28 16:06:40.304850571 +0000 UTC m=+0.066739754 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 11:06:40 np0005538960 python3.9[150724]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:06:41 np0005538960 python3.9[150866]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764345999.854982-2286-184950797429232/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:41 np0005538960 python3.9[151016]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:06:42 np0005538960 python3.9[151171]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Nov 28 11:06:44 np0005538960 dbus-broker-launch[776]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Nov 28 11:06:45 np0005538960 python3.9[151327]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:45 np0005538960 python3.9[151479]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:48 np0005538960 python3.9[151631]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:49 np0005538960 python3.9[151783]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:50 np0005538960 python3.9[151935]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:51 np0005538960 python3.9[152087]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:51 np0005538960 python3.9[152239]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:52 np0005538960 python3.9[152391]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:53 np0005538960 python3.9[152543]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:54 np0005538960 python3.9[152695]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:06:57 np0005538960 python3.9[152847]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:06:57 np0005538960 systemd[1]: Reloading.
Nov 28 11:06:57 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:06:57 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:06:57 np0005538960 systemd[1]: Starting libvirt logging daemon socket...
Nov 28 11:06:57 np0005538960 systemd[1]: Listening on libvirt logging daemon socket.
Nov 28 11:06:57 np0005538960 systemd[1]: Starting libvirt logging daemon admin socket...
Nov 28 11:06:57 np0005538960 systemd[1]: Listening on libvirt logging daemon admin socket.
Nov 28 11:06:57 np0005538960 systemd[1]: Starting libvirt logging daemon...
Nov 28 11:06:57 np0005538960 systemd[1]: Started libvirt logging daemon.
Nov 28 11:06:58 np0005538960 python3.9[153040]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:06:58 np0005538960 systemd[1]: Reloading.
Nov 28 11:06:58 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:06:58 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:06:58 np0005538960 systemd[1]: Starting libvirt nodedev daemon socket...
Nov 28 11:06:58 np0005538960 systemd[1]: Listening on libvirt nodedev daemon socket.
Nov 28 11:06:58 np0005538960 systemd[1]: Starting libvirt nodedev daemon admin socket...
Nov 28 11:06:58 np0005538960 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Nov 28 11:06:58 np0005538960 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Nov 28 11:06:58 np0005538960 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Nov 28 11:06:58 np0005538960 systemd[1]: Starting libvirt nodedev daemon...
Nov 28 11:06:59 np0005538960 systemd[1]: Started libvirt nodedev daemon.
Nov 28 11:06:59 np0005538960 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Nov 28 11:06:59 np0005538960 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Nov 28 11:06:59 np0005538960 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Nov 28 11:06:59 np0005538960 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Nov 28 11:06:59 np0005538960 python3.9[153264]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:06:59 np0005538960 systemd[1]: Reloading.
Nov 28 11:07:00 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:07:00 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:07:00 np0005538960 setroubleshoot[153103]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l bfaed113-eb10-4bf6-b417-9877882c3bd1
Nov 28 11:07:00 np0005538960 setroubleshoot[153103]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 28 11:07:00 np0005538960 setroubleshoot[153103]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l bfaed113-eb10-4bf6-b417-9877882c3bd1
Nov 28 11:07:00 np0005538960 setroubleshoot[153103]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Nov 28 11:07:00 np0005538960 systemd[1]: Starting libvirt proxy daemon admin socket...
Nov 28 11:07:00 np0005538960 systemd[1]: Starting libvirt proxy daemon read-only socket...
Nov 28 11:07:00 np0005538960 systemd[1]: Listening on libvirt proxy daemon admin socket.
Nov 28 11:07:00 np0005538960 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Nov 28 11:07:00 np0005538960 systemd[1]: Starting libvirt proxy daemon...
Nov 28 11:07:00 np0005538960 systemd[1]: Started libvirt proxy daemon.
Nov 28 11:07:01 np0005538960 python3.9[153478]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:07:01 np0005538960 systemd[1]: Reloading.
Nov 28 11:07:01 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:07:01 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:07:01 np0005538960 systemd[1]: Listening on libvirt locking daemon socket.
Nov 28 11:07:01 np0005538960 systemd[1]: Starting libvirt QEMU daemon socket...
Nov 28 11:07:01 np0005538960 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Nov 28 11:07:01 np0005538960 systemd[1]: Starting Virtual Machine and Container Registration Service...
Nov 28 11:07:01 np0005538960 systemd[1]: Listening on libvirt QEMU daemon socket.
Nov 28 11:07:01 np0005538960 systemd[1]: Starting libvirt QEMU daemon admin socket...
Nov 28 11:07:01 np0005538960 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Nov 28 11:07:01 np0005538960 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Nov 28 11:07:01 np0005538960 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Nov 28 11:07:01 np0005538960 systemd[1]: Started Virtual Machine and Container Registration Service.
Nov 28 11:07:01 np0005538960 systemd[1]: Starting libvirt QEMU daemon...
Nov 28 11:07:01 np0005538960 systemd[1]: Started libvirt QEMU daemon.
Nov 28 11:07:02 np0005538960 python3.9[153693]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:07:02 np0005538960 systemd[1]: Reloading.
Nov 28 11:07:02 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:07:02 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:07:03 np0005538960 systemd[1]: Starting libvirt secret daemon socket...
Nov 28 11:07:03 np0005538960 systemd[1]: Listening on libvirt secret daemon socket.
Nov 28 11:07:03 np0005538960 systemd[1]: Starting libvirt secret daemon admin socket...
Nov 28 11:07:03 np0005538960 systemd[1]: Starting libvirt secret daemon read-only socket...
Nov 28 11:07:03 np0005538960 systemd[1]: Listening on libvirt secret daemon admin socket.
Nov 28 11:07:03 np0005538960 systemd[1]: Listening on libvirt secret daemon read-only socket.
Nov 28 11:07:03 np0005538960 systemd[1]: Starting libvirt secret daemon...
Nov 28 11:07:03 np0005538960 systemd[1]: Started libvirt secret daemon.
Nov 28 11:07:04 np0005538960 python3.9[153904]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:04 np0005538960 podman[154028]: 2025-11-28 16:07:04.942866637 +0000 UTC m=+0.103554586 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 11:07:05 np0005538960 python3.9[154077]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 11:07:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:07:06.326 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:07:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:07:06.326 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:07:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:07:06.327 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:07:06 np0005538960 python3.9[154235]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:07 np0005538960 python3.9[154358]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346025.9460206-3321-111802378077262/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:08 np0005538960 python3.9[154510]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:09 np0005538960 python3.9[154662]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:09 np0005538960 python3.9[154740]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:10 np0005538960 podman[154864]: 2025-11-28 16:07:10.530174959 +0000 UTC m=+0.069961494 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:07:10 np0005538960 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Nov 28 11:07:10 np0005538960 systemd[1]: setroubleshootd.service: Deactivated successfully.
Nov 28 11:07:10 np0005538960 python3.9[154911]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:11 np0005538960 python3.9[154989]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.tsppt72i recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:12 np0005538960 python3.9[155141]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:12 np0005538960 python3.9[155219]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:13 np0005538960 python3.9[155371]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:07:14 np0005538960 python3[155526]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 11:07:15 np0005538960 python3.9[155678]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:15 np0005538960 python3.9[155756]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:16 np0005538960 python3.9[155908]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:17 np0005538960 python3.9[155986]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:18 np0005538960 python3.9[156138]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:18 np0005538960 python3.9[156216]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:19 np0005538960 python3.9[156368]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:20 np0005538960 python3.9[156446]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:21 np0005538960 python3.9[156598]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:21 np0005538960 python3.9[156723]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764346040.6281586-3696-251909520597488/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:23 np0005538960 python3.9[156875]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:23 np0005538960 python3.9[157027]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:07:25 np0005538960 python3.9[157182]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:26 np0005538960 python3.9[157334]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:07:27 np0005538960 python3.9[157487]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:07:28 np0005538960 python3.9[157641]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:07:28 np0005538960 python3.9[157796]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:29 np0005538960 python3.9[157948]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:30 np0005538960 python3.9[158071]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346049.2270374-3912-269410916627585/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:31 np0005538960 python3.9[158223]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:32 np0005538960 python3.9[158346]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346050.7706826-3957-141446557414313/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:33 np0005538960 python3.9[158498]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:07:33 np0005538960 python3.9[158621]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346052.2646644-4003-31420485861966/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:07:34 np0005538960 python3.9[158773]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:07:34 np0005538960 systemd[1]: Reloading.
Nov 28 11:07:34 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:07:34 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:07:35 np0005538960 systemd[1]: Reached target edpm_libvirt.target.
Nov 28 11:07:35 np0005538960 podman[158811]: 2025-11-28 16:07:35.286889126 +0000 UTC m=+0.103951087 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 11:07:36 np0005538960 python3.9[158990]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Nov 28 11:07:36 np0005538960 systemd[1]: Reloading.
Nov 28 11:07:36 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:07:36 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:07:36 np0005538960 systemd[1]: Reloading.
Nov 28 11:07:36 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:07:36 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:07:37 np0005538960 systemd[1]: session-24.scope: Deactivated successfully.
Nov 28 11:07:37 np0005538960 systemd[1]: session-24.scope: Consumed 3min 49.951s CPU time.
Nov 28 11:07:37 np0005538960 systemd-logind[788]: Session 24 logged out. Waiting for processes to exit.
Nov 28 11:07:37 np0005538960 systemd-logind[788]: Removed session 24.
Nov 28 11:07:41 np0005538960 podman[159089]: 2025-11-28 16:07:41.198370391 +0000 UTC m=+0.087772906 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:07:43 np0005538960 systemd-logind[788]: New session 25 of user zuul.
Nov 28 11:07:43 np0005538960 systemd[1]: Started Session 25 of User zuul.
Nov 28 11:07:45 np0005538960 python3.9[159263]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 11:07:46 np0005538960 python3.9[159417]: ansible-ansible.builtin.service_facts Invoked
Nov 28 11:07:46 np0005538960 network[159434]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 11:07:46 np0005538960 network[159435]: 'network-scripts' will be removed from distribution in near future.
Nov 28 11:07:46 np0005538960 network[159436]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 11:07:52 np0005538960 python3.9[159707]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Nov 28 11:07:53 np0005538960 python3.9[159791]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 11:07:59 np0005538960 python3.9[159944]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:08:00 np0005538960 python3.9[160096]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:08:01 np0005538960 python3.9[160249]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:08:02 np0005538960 python3.9[160401]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:08:03 np0005538960 python3.9[160554]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:04 np0005538960 python3.9[160677]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346082.9062421-246-209816808130513/.source.iscsi _original_basename=.gdy09y4m follow=False checksum=d131660cae9a3956a3ee1fd3aae83aacd1fde6df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:05 np0005538960 python3.9[160829]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:05 np0005538960 podman[160953]: 2025-11-28 16:08:05.961798647 +0000 UTC m=+0.102596294 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:08:06 np0005538960 python3.9[161002]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:06 np0005538960 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 11:08:06 np0005538960 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 11:08:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:08:06.326 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:08:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:08:06.327 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:08:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:08:06.328 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:08:07 np0005538960 python3.9[161161]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:08:07 np0005538960 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Nov 28 11:08:08 np0005538960 python3.9[161317]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:08:08 np0005538960 systemd[1]: Reloading.
Nov 28 11:08:08 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:08:08 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:08:08 np0005538960 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 28 11:08:08 np0005538960 systemd[1]: Starting Open-iSCSI...
Nov 28 11:08:09 np0005538960 kernel: Loading iSCSI transport class v2.0-870.
Nov 28 11:08:09 np0005538960 systemd[1]: Started Open-iSCSI.
Nov 28 11:08:09 np0005538960 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Nov 28 11:08:09 np0005538960 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Nov 28 11:08:10 np0005538960 python3.9[161517]: ansible-ansible.builtin.service_facts Invoked
Nov 28 11:08:10 np0005538960 network[161534]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 11:08:10 np0005538960 network[161535]: 'network-scripts' will be removed from distribution in near future.
Nov 28 11:08:10 np0005538960 network[161536]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 11:08:11 np0005538960 podman[161548]: 2025-11-28 16:08:11.362579544 +0000 UTC m=+0.110853397 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent)
Nov 28 11:08:17 np0005538960 python3.9[161828]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 11:08:18 np0005538960 python3.9[161980]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Nov 28 11:08:19 np0005538960 python3.9[162136]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:19 np0005538960 python3.9[162259]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346098.5434923-477-216931532641521/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:20 np0005538960 python3.9[162411]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:22 np0005538960 python3.9[162563]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:08:22 np0005538960 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 11:08:22 np0005538960 systemd[1]: Stopped Load Kernel Modules.
Nov 28 11:08:22 np0005538960 systemd[1]: Stopping Load Kernel Modules...
Nov 28 11:08:22 np0005538960 systemd[1]: Starting Load Kernel Modules...
Nov 28 11:08:22 np0005538960 systemd[1]: Finished Load Kernel Modules.
Nov 28 11:08:23 np0005538960 python3.9[162719]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:08:24 np0005538960 python3.9[162871]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:08:25 np0005538960 python3.9[163023]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:08:25 np0005538960 python3.9[163175]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:26 np0005538960 python3.9[163298]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346105.3864872-651-222236473554486/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:27 np0005538960 python3.9[163450]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:08:28 np0005538960 python3.9[163603]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:29 np0005538960 python3.9[163755]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:30 np0005538960 python3.9[163907]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:31 np0005538960 python3.9[164059]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:32 np0005538960 python3.9[164211]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:33 np0005538960 python3.9[164363]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:34 np0005538960 python3.9[164515]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:35 np0005538960 python3.9[164667]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:08:35 np0005538960 python3.9[164821]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:36 np0005538960 podman[164846]: 2025-11-28 16:08:36.264165943 +0000 UTC m=+0.147318015 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 11:08:37 np0005538960 python3.9[164998]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:08:38 np0005538960 python3.9[165150]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:38 np0005538960 python3.9[165228]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:08:39 np0005538960 python3.9[165380]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:39 np0005538960 python3.9[165458]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:08:40 np0005538960 python3.9[165610]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:41 np0005538960 podman[165762]: 2025-11-28 16:08:41.514807928 +0000 UTC m=+0.066769544 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 11:08:41 np0005538960 python3.9[165763]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:42 np0005538960 python3.9[165859]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:43 np0005538960 python3.9[166011]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:43 np0005538960 python3.9[166089]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:44 np0005538960 python3.9[166241]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:08:44 np0005538960 systemd[1]: Reloading.
Nov 28 11:08:44 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:08:44 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:08:45 np0005538960 python3.9[166429]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:46 np0005538960 python3.9[166507]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:47 np0005538960 python3.9[166659]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:47 np0005538960 python3.9[166739]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:48 np0005538960 python3.9[166891]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:08:48 np0005538960 systemd[1]: Reloading.
Nov 28 11:08:48 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:08:48 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:08:49 np0005538960 systemd[1]: Starting Create netns directory...
Nov 28 11:08:49 np0005538960 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Nov 28 11:08:49 np0005538960 systemd[1]: netns-placeholder.service: Deactivated successfully.
Nov 28 11:08:49 np0005538960 systemd[1]: Finished Create netns directory.
Nov 28 11:08:50 np0005538960 python3.9[167085]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:08:50 np0005538960 python3.9[167237]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:51 np0005538960 python3.9[167360]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346130.3472517-1272-102670627138487/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:08:52 np0005538960 python3.9[167512]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:08:53 np0005538960 python3.9[167664]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:08:54 np0005538960 python3.9[167789]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346133.1791131-1347-256716016306364/.source.json _original_basename=.52n12kx7 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:55 np0005538960 python3.9[167941]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:08:58 np0005538960 python3.9[168369]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Nov 28 11:08:59 np0005538960 systemd[1]: virtnodedevd.service: Deactivated successfully.
Nov 28 11:08:59 np0005538960 python3.9[168521]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 11:09:00 np0005538960 python3.9[168674]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Nov 28 11:09:00 np0005538960 systemd[1]: virtproxyd.service: Deactivated successfully.
Nov 28 11:09:01 np0005538960 systemd[1]: virtqemud.service: Deactivated successfully.
Nov 28 11:09:02 np0005538960 python3[168854]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 11:09:02 np0005538960 podman[168892]: 2025-11-28 16:09:02.587940212 +0000 UTC m=+0.073140649 container create 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 11:09:02 np0005538960 podman[168892]: 2025-11-28 16:09:02.553303777 +0000 UTC m=+0.038504284 image pull f275b8d168f7f57f31e3da49224019f39f95c80a833f083696a964527b07b54f quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 11:09:02 np0005538960 python3[168854]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Nov 28 11:09:03 np0005538960 systemd[1]: virtsecretd.service: Deactivated successfully.
Nov 28 11:09:03 np0005538960 python3.9[169083]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:09:04 np0005538960 python3.9[169237]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:05 np0005538960 python3.9[169313]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:09:05 np0005538960 python3.9[169464]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764346145.215522-1611-171729262126839/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:09:06.328 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:09:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:09:06.331 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:09:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:09:06.331 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:09:06 np0005538960 podman[169540]: 2025-11-28 16:09:06.503776773 +0000 UTC m=+0.112156495 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:09:06 np0005538960 python3.9[169541]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:09:06 np0005538960 systemd[1]: Reloading.
Nov 28 11:09:06 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:09:06 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:09:07 np0005538960 python3.9[169676]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:09:07 np0005538960 systemd[1]: Reloading.
Nov 28 11:09:07 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:09:07 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:09:08 np0005538960 systemd[1]: Starting multipathd container...
Nov 28 11:09:08 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:09:08 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc586e6678dbb0fae5c778d48ed4c3261e7df7689d83f48fbc28e8162acf2656/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 11:09:08 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc586e6678dbb0fae5c778d48ed4c3261e7df7689d83f48fbc28e8162acf2656/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 11:09:08 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a.
Nov 28 11:09:08 np0005538960 podman[169716]: 2025-11-28 16:09:08.49852998 +0000 UTC m=+0.337582362 container init 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd)
Nov 28 11:09:08 np0005538960 multipathd[169731]: + sudo -E kolla_set_configs
Nov 28 11:09:08 np0005538960 podman[169716]: 2025-11-28 16:09:08.538848068 +0000 UTC m=+0.377900420 container start 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 11:09:08 np0005538960 podman[169716]: multipathd
Nov 28 11:09:08 np0005538960 systemd[1]: Started multipathd container.
Nov 28 11:09:08 np0005538960 multipathd[169731]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 11:09:08 np0005538960 multipathd[169731]: INFO:__main__:Validating config file
Nov 28 11:09:08 np0005538960 multipathd[169731]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 11:09:08 np0005538960 multipathd[169731]: INFO:__main__:Writing out command to execute
Nov 28 11:09:08 np0005538960 multipathd[169731]: ++ cat /run_command
Nov 28 11:09:08 np0005538960 multipathd[169731]: + CMD='/usr/sbin/multipathd -d'
Nov 28 11:09:08 np0005538960 multipathd[169731]: + ARGS=
Nov 28 11:09:08 np0005538960 multipathd[169731]: + sudo kolla_copy_cacerts
Nov 28 11:09:08 np0005538960 multipathd[169731]: + [[ ! -n '' ]]
Nov 28 11:09:08 np0005538960 multipathd[169731]: + . kolla_extend_start
Nov 28 11:09:08 np0005538960 multipathd[169731]: Running command: '/usr/sbin/multipathd -d'
Nov 28 11:09:08 np0005538960 multipathd[169731]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 28 11:09:08 np0005538960 multipathd[169731]: + umask 0022
Nov 28 11:09:08 np0005538960 multipathd[169731]: + exec /usr/sbin/multipathd -d
Nov 28 11:09:08 np0005538960 multipathd[169731]: 3248.304387 | --------start up--------
Nov 28 11:09:08 np0005538960 multipathd[169731]: 3248.304411 | read /etc/multipath.conf
Nov 28 11:09:08 np0005538960 multipathd[169731]: 3248.311726 | path checkers start up
Nov 28 11:09:08 np0005538960 podman[169738]: 2025-11-28 16:09:08.687791352 +0000 UTC m=+0.122006642 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 28 11:09:08 np0005538960 systemd[1]: 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a-3c37a25ae6aee23a.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 11:09:08 np0005538960 systemd[1]: 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a-3c37a25ae6aee23a.service: Failed with result 'exit-code'.
Nov 28 11:09:11 np0005538960 python3.9[169921]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:09:11 np0005538960 podman[170048]: 2025-11-28 16:09:11.938582153 +0000 UTC m=+0.109241502 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:09:12 np0005538960 python3.9[170095]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:09:12 np0005538960 python3.9[170260]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:09:13 np0005538960 systemd[1]: Stopping multipathd container...
Nov 28 11:09:13 np0005538960 multipathd[169731]: 3252.754409 | exit (signal)
Nov 28 11:09:13 np0005538960 multipathd[169731]: 3252.754515 | --------shut down-------
Nov 28 11:09:13 np0005538960 systemd[1]: libpod-7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a.scope: Deactivated successfully.
Nov 28 11:09:13 np0005538960 podman[170264]: 2025-11-28 16:09:13.151468429 +0000 UTC m=+0.087702364 container died 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Nov 28 11:09:13 np0005538960 systemd[1]: 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a-3c37a25ae6aee23a.timer: Deactivated successfully.
Nov 28 11:09:13 np0005538960 systemd[1]: Stopped /usr/bin/podman healthcheck run 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a.
Nov 28 11:09:13 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a-userdata-shm.mount: Deactivated successfully.
Nov 28 11:09:13 np0005538960 systemd[1]: var-lib-containers-storage-overlay-bc586e6678dbb0fae5c778d48ed4c3261e7df7689d83f48fbc28e8162acf2656-merged.mount: Deactivated successfully.
Nov 28 11:09:13 np0005538960 podman[170264]: 2025-11-28 16:09:13.21148396 +0000 UTC m=+0.147717895 container cleanup 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 11:09:13 np0005538960 podman[170264]: multipathd
Nov 28 11:09:13 np0005538960 podman[170290]: multipathd
Nov 28 11:09:13 np0005538960 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Nov 28 11:09:13 np0005538960 systemd[1]: Stopped multipathd container.
Nov 28 11:09:13 np0005538960 systemd[1]: Starting multipathd container...
Nov 28 11:09:13 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:09:13 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc586e6678dbb0fae5c778d48ed4c3261e7df7689d83f48fbc28e8162acf2656/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 11:09:13 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc586e6678dbb0fae5c778d48ed4c3261e7df7689d83f48fbc28e8162acf2656/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 11:09:13 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a.
Nov 28 11:09:13 np0005538960 podman[170302]: 2025-11-28 16:09:13.477053989 +0000 UTC m=+0.160093733 container init 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 11:09:13 np0005538960 multipathd[170318]: + sudo -E kolla_set_configs
Nov 28 11:09:13 np0005538960 podman[170302]: 2025-11-28 16:09:13.505536452 +0000 UTC m=+0.188576166 container start 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Nov 28 11:09:13 np0005538960 podman[170302]: multipathd
Nov 28 11:09:13 np0005538960 systemd[1]: Started multipathd container.
Nov 28 11:09:13 np0005538960 multipathd[170318]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 11:09:13 np0005538960 multipathd[170318]: INFO:__main__:Validating config file
Nov 28 11:09:13 np0005538960 multipathd[170318]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 11:09:13 np0005538960 multipathd[170318]: INFO:__main__:Writing out command to execute
Nov 28 11:09:13 np0005538960 multipathd[170318]: ++ cat /run_command
Nov 28 11:09:13 np0005538960 multipathd[170318]: + CMD='/usr/sbin/multipathd -d'
Nov 28 11:09:13 np0005538960 multipathd[170318]: + ARGS=
Nov 28 11:09:13 np0005538960 multipathd[170318]: + sudo kolla_copy_cacerts
Nov 28 11:09:13 np0005538960 multipathd[170318]: + [[ ! -n '' ]]
Nov 28 11:09:13 np0005538960 multipathd[170318]: + . kolla_extend_start
Nov 28 11:09:13 np0005538960 multipathd[170318]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Nov 28 11:09:13 np0005538960 multipathd[170318]: Running command: '/usr/sbin/multipathd -d'
Nov 28 11:09:13 np0005538960 multipathd[170318]: + umask 0022
Nov 28 11:09:13 np0005538960 multipathd[170318]: + exec /usr/sbin/multipathd -d
Nov 28 11:09:13 np0005538960 multipathd[170318]: 3253.252459 | --------start up--------
Nov 28 11:09:13 np0005538960 multipathd[170318]: 3253.252479 | read /etc/multipath.conf
Nov 28 11:09:13 np0005538960 multipathd[170318]: 3253.259033 | path checkers start up
Nov 28 11:09:13 np0005538960 podman[170325]: 2025-11-28 16:09:13.626042495 +0000 UTC m=+0.104040602 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 28 11:09:13 np0005538960 systemd[1]: 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a-357d3e96d3bc2777.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 11:09:13 np0005538960 systemd[1]: 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a-357d3e96d3bc2777.service: Failed with result 'exit-code'.
Nov 28 11:09:14 np0005538960 python3.9[170507]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:15 np0005538960 python3.9[170659]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Nov 28 11:09:16 np0005538960 python3.9[170811]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Nov 28 11:09:16 np0005538960 kernel: Key type psk registered
Nov 28 11:09:17 np0005538960 python3.9[170974]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:09:18 np0005538960 python3.9[171097]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346156.8827932-1851-5280848427094/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:19 np0005538960 python3.9[171249]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:20 np0005538960 python3.9[171401]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:09:20 np0005538960 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Nov 28 11:09:20 np0005538960 systemd[1]: Stopped Load Kernel Modules.
Nov 28 11:09:20 np0005538960 systemd[1]: Stopping Load Kernel Modules...
Nov 28 11:09:20 np0005538960 systemd[1]: Starting Load Kernel Modules...
Nov 28 11:09:20 np0005538960 systemd[1]: Finished Load Kernel Modules.
Nov 28 11:09:21 np0005538960 python3.9[171557]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Nov 28 11:09:24 np0005538960 systemd[1]: Reloading.
Nov 28 11:09:24 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:09:24 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:09:24 np0005538960 systemd[1]: Reloading.
Nov 28 11:09:24 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:09:24 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:09:25 np0005538960 systemd-logind[788]: Watching system buttons on /dev/input/event0 (Power Button)
Nov 28 11:09:25 np0005538960 systemd-logind[788]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Nov 28 11:09:25 np0005538960 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Nov 28 11:09:25 np0005538960 systemd[1]: Starting man-db-cache-update.service...
Nov 28 11:09:25 np0005538960 systemd[1]: Reloading.
Nov 28 11:09:25 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:09:25 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:09:25 np0005538960 systemd[1]: Queuing reload/restart jobs for marked units…
Nov 28 11:09:27 np0005538960 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Nov 28 11:09:27 np0005538960 systemd[1]: Finished man-db-cache-update.service.
Nov 28 11:09:27 np0005538960 systemd[1]: man-db-cache-update.service: Consumed 1.591s CPU time.
Nov 28 11:09:27 np0005538960 systemd[1]: run-rf433aa33965b4320ab4439789d765667.service: Deactivated successfully.
Nov 28 11:09:27 np0005538960 python3.9[173010]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:09:27 np0005538960 systemd[1]: Stopping Open-iSCSI...
Nov 28 11:09:27 np0005538960 iscsid[161357]: iscsid shutting down.
Nov 28 11:09:27 np0005538960 systemd[1]: iscsid.service: Deactivated successfully.
Nov 28 11:09:27 np0005538960 systemd[1]: Stopped Open-iSCSI.
Nov 28 11:09:27 np0005538960 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Nov 28 11:09:27 np0005538960 systemd[1]: Starting Open-iSCSI...
Nov 28 11:09:27 np0005538960 systemd[1]: Started Open-iSCSI.
Nov 28 11:09:28 np0005538960 python3.9[173166]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 11:09:29 np0005538960 python3.9[173322]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:30 np0005538960 python3.9[173474]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:09:30 np0005538960 systemd[1]: Reloading.
Nov 28 11:09:31 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:09:31 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:09:32 np0005538960 python3.9[173658]: ansible-ansible.builtin.service_facts Invoked
Nov 28 11:09:32 np0005538960 network[173675]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 11:09:32 np0005538960 network[173676]: 'network-scripts' will be removed from distribution in near future.
Nov 28 11:09:32 np0005538960 network[173677]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 11:09:36 np0005538960 podman[173749]: 2025-11-28 16:09:36.675403173 +0000 UTC m=+0.112392591 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:09:39 np0005538960 python3.9[173979]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:09:39 np0005538960 python3.9[174132]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:09:40 np0005538960 python3.9[174285]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:09:41 np0005538960 python3.9[174438]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:09:42 np0005538960 podman[174516]: 2025-11-28 16:09:42.187618908 +0000 UTC m=+0.084405691 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 11:09:42 np0005538960 python3.9[174608]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:09:43 np0005538960 python3.9[174761]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:09:44 np0005538960 podman[174886]: 2025-11-28 16:09:44.078920078 +0000 UTC m=+0.089141731 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 11:09:44 np0005538960 python3.9[174934]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:09:45 np0005538960 python3.9[175088]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:09:47 np0005538960 python3.9[175241]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:48 np0005538960 python3.9[175393]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:48 np0005538960 python3.9[175545]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:49 np0005538960 python3.9[175697]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:50 np0005538960 python3.9[175849]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:50 np0005538960 python3.9[176001]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:51 np0005538960 python3.9[176153]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:52 np0005538960 python3.9[176305]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:53 np0005538960 python3.9[176458]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:54 np0005538960 python3.9[176610]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:55 np0005538960 python3.9[176762]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:56 np0005538960 python3.9[176914]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:56 np0005538960 python3.9[177066]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:57 np0005538960 python3.9[177219]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:58 np0005538960 python3.9[177371]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:09:59 np0005538960 python3.9[177523]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:10:00 np0005538960 python3.9[177677]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:10:01 np0005538960 python3.9[177829]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 11:10:02 np0005538960 python3.9[177981]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:10:02 np0005538960 systemd[1]: Reloading.
Nov 28 11:10:02 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:10:02 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:10:03 np0005538960 python3.9[178168]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:10:04 np0005538960 python3.9[178321]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:10:04 np0005538960 python3.9[178474]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:10:05 np0005538960 python3.9[178627]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:10:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:10:06.328 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:10:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:10:06.329 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:10:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:10:06.329 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:10:06 np0005538960 python3.9[178780]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:10:06 np0005538960 podman[178782]: 2025-11-28 16:10:06.902213459 +0000 UTC m=+0.144864205 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Nov 28 11:10:07 np0005538960 python3.9[178959]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:10:08 np0005538960 python3.9[179112]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:10:09 np0005538960 python3.9[179265]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:10:10 np0005538960 python3.9[179420]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:11 np0005538960 python3.9[179572]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:12 np0005538960 podman[179696]: 2025-11-28 16:10:12.646537705 +0000 UTC m=+0.092950692 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 11:10:12 np0005538960 python3.9[179741]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:13 np0005538960 python3.9[179895]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:14 np0005538960 podman[180019]: 2025-11-28 16:10:14.23094102 +0000 UTC m=+0.083471362 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Nov 28 11:10:14 np0005538960 python3.9[180068]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:15 np0005538960 python3.9[180220]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:15 np0005538960 python3.9[180372]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:16 np0005538960 python3.9[180524]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:17 np0005538960 python3.9[180676]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:18 np0005538960 python3.9[180828]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:23 np0005538960 python3.9[180980]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Nov 28 11:10:24 np0005538960 python3.9[181133]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 11:10:25 np0005538960 python3.9[181291]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 11:10:27 np0005538960 systemd-logind[788]: New session 26 of user zuul.
Nov 28 11:10:27 np0005538960 systemd[1]: Started Session 26 of User zuul.
Nov 28 11:10:27 np0005538960 systemd[1]: session-26.scope: Deactivated successfully.
Nov 28 11:10:27 np0005538960 systemd-logind[788]: Session 26 logged out. Waiting for processes to exit.
Nov 28 11:10:27 np0005538960 systemd-logind[788]: Removed session 26.
Nov 28 11:10:28 np0005538960 python3.9[181477]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:10:29 np0005538960 python3.9[181598]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346227.8906033-3414-191169352570311/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:29 np0005538960 python3.9[181748]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:10:30 np0005538960 python3.9[181824]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:31 np0005538960 python3.9[181974]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:10:31 np0005538960 python3.9[182095]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346230.5068772-3414-9385816577003/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:32 np0005538960 python3.9[182245]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:10:33 np0005538960 python3.9[182366]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346231.9820704-3414-38473100818206/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:33 np0005538960 python3.9[182516]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:10:34 np0005538960 python3.9[182637]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346233.2806346-3414-49853900693673/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:35 np0005538960 python3.9[182787]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:10:35 np0005538960 python3.9[182908]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346234.62128-3414-256148173554845/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:36 np0005538960 python3.9[183060]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:10:37 np0005538960 podman[183160]: 2025-11-28 16:10:37.197553732 +0000 UTC m=+0.103220683 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 11:10:37 np0005538960 python3.9[183236]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:10:38 np0005538960 python3.9[183388]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:10:38 np0005538960 python3.9[183540]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:10:39 np0005538960 python3.9[183663]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1764346238.4619126-3735-123303028511928/.source _original_basename=.updjxgk4 follow=False checksum=425a90a5328c1dd193520770636816f4e61723c0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Nov 28 11:10:40 np0005538960 python3.9[183815]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:10:41 np0005538960 python3.9[183967]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:10:41 np0005538960 python3.9[184088]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346240.7050774-3813-221296578172170/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:42 np0005538960 python3.9[184238]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:10:42 np0005538960 podman[184333]: 2025-11-28 16:10:42.958154584 +0000 UTC m=+0.065084734 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 11:10:43 np0005538960 python3.9[184372]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346242.0234494-3859-190121452002803/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:10:44 np0005538960 python3.9[184530]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Nov 28 11:10:44 np0005538960 podman[184654]: 2025-11-28 16:10:44.757866817 +0000 UTC m=+0.067557695 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 11:10:44 np0005538960 python3.9[184700]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 11:10:46 np0005538960 python3[184852]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 11:10:46 np0005538960 podman[184890]: 2025-11-28 16:10:46.237669997 +0000 UTC m=+0.057490430 container create 401a51a2bda80a10ee7835dfe5ba0c351125c6832d283df0d74f7a4b6633e6ee (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 11:10:46 np0005538960 podman[184890]: 2025-11-28 16:10:46.209332228 +0000 UTC m=+0.029152671 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 11:10:46 np0005538960 python3[184852]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Nov 28 11:10:47 np0005538960 python3.9[185077]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:10:48 np0005538960 python3.9[185231]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Nov 28 11:10:49 np0005538960 python3.9[185383]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 11:10:51 np0005538960 python3[185535]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 11:10:51 np0005538960 podman[185572]: 2025-11-28 16:10:51.697060811 +0000 UTC m=+0.072671319 image pull b65793e7266422f5b94c32d109b906c8ffd974cf2ddf0b6929e463e29e05864a quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Nov 28 11:10:52 np0005538960 podman[185572]: 2025-11-28 16:10:52.739650054 +0000 UTC m=+1.115260512 container create 6c636f4e46cac6bc79d1c12c20d169405a361dbe7a4b183763b35ccecbd8c224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:10:52 np0005538960 python3[185535]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Nov 28 11:10:53 np0005538960 python3.9[185762]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:10:54 np0005538960 python3.9[185916]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:10:55 np0005538960 python3.9[186067]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764346254.735321-4134-224339737602886/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:10:55 np0005538960 python3.9[186143]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:10:55 np0005538960 systemd[1]: Reloading.
Nov 28 11:10:56 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:10:56 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:10:56 np0005538960 python3.9[186253]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:10:56 np0005538960 systemd[1]: Reloading.
Nov 28 11:10:56 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:10:56 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:10:57 np0005538960 systemd[1]: Starting nova_compute container...
Nov 28 11:10:57 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:10:57 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 11:10:57 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 11:10:57 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 11:10:57 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 11:10:57 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 11:10:58 np0005538960 podman[186293]: 2025-11-28 16:10:58.016230738 +0000 UTC m=+0.475379708 container init 6c636f4e46cac6bc79d1c12c20d169405a361dbe7a4b183763b35ccecbd8c224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Nov 28 11:10:58 np0005538960 podman[186293]: 2025-11-28 16:10:58.024700953 +0000 UTC m=+0.483849903 container start 6c636f4e46cac6bc79d1c12c20d169405a361dbe7a4b183763b35ccecbd8c224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible)
Nov 28 11:10:58 np0005538960 podman[186293]: nova_compute
Nov 28 11:10:58 np0005538960 nova_compute[186309]: + sudo -E kolla_set_configs
Nov 28 11:10:58 np0005538960 systemd[1]: Started nova_compute container.
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Validating config file
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Copying service configuration files
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Deleting /etc/ceph
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Creating directory /etc/ceph
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Writing out command to execute
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 11:10:58 np0005538960 nova_compute[186309]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 11:10:58 np0005538960 nova_compute[186309]: ++ cat /run_command
Nov 28 11:10:58 np0005538960 nova_compute[186309]: + CMD=nova-compute
Nov 28 11:10:58 np0005538960 nova_compute[186309]: + ARGS=
Nov 28 11:10:58 np0005538960 nova_compute[186309]: + sudo kolla_copy_cacerts
Nov 28 11:10:58 np0005538960 nova_compute[186309]: + [[ ! -n '' ]]
Nov 28 11:10:58 np0005538960 nova_compute[186309]: + . kolla_extend_start
Nov 28 11:10:58 np0005538960 nova_compute[186309]: Running command: 'nova-compute'
Nov 28 11:10:58 np0005538960 nova_compute[186309]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 11:10:58 np0005538960 nova_compute[186309]: + umask 0022
Nov 28 11:10:58 np0005538960 nova_compute[186309]: + exec nova-compute
Nov 28 11:10:59 np0005538960 python3.9[186471]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:11:00 np0005538960 nova_compute[186309]: 2025-11-28 16:11:00.312 186313 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 11:11:00 np0005538960 nova_compute[186309]: 2025-11-28 16:11:00.313 186313 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 11:11:00 np0005538960 nova_compute[186309]: 2025-11-28 16:11:00.313 186313 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 11:11:00 np0005538960 nova_compute[186309]: 2025-11-28 16:11:00.313 186313 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 28 11:11:00 np0005538960 python3.9[186621]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:11:00 np0005538960 nova_compute[186309]: 2025-11-28 16:11:00.521 186313 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:11:00 np0005538960 nova_compute[186309]: 2025-11-28 16:11:00.553 186313 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:11:00 np0005538960 nova_compute[186309]: 2025-11-28 16:11:00.554 186313 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.075 186313 INFO nova.virt.driver [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.210 186313 INFO nova.compute.provider_config [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.224 186313 DEBUG oslo_concurrency.lockutils [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.224 186313 DEBUG oslo_concurrency.lockutils [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.225 186313 DEBUG oslo_concurrency.lockutils [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.225 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.225 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.225 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.225 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.226 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.226 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.226 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.226 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.226 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.226 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.226 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.227 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.227 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.227 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.227 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.227 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.227 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.227 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.228 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.228 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.228 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.228 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.228 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.228 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.228 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.229 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.229 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.229 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.229 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.229 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.229 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.230 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.230 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.230 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.230 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.230 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.230 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.230 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.231 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.231 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.231 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.231 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.231 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.231 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.232 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.232 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.232 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.232 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.232 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.232 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.232 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.233 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.233 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.233 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.233 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.233 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.233 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.234 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.234 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.234 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.234 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.234 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.234 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.235 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.235 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.235 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.235 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.235 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.235 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.236 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.236 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.236 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.236 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.236 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.236 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.237 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.237 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.237 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.237 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.237 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.238 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.238 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.238 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.238 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.238 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.239 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.239 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.239 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.239 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.239 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.239 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.240 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.240 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.240 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.240 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.240 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.240 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.240 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.241 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.241 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.241 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.241 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.241 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.241 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.241 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.242 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.242 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.242 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.242 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.242 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.242 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.242 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.243 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.243 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.243 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.243 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.243 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.243 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.243 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.244 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.244 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.244 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.244 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.244 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.245 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.245 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.245 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.245 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.245 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.245 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.245 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.246 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.246 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.246 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.246 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.246 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.246 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.246 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.247 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.247 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.247 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.247 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.247 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.247 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.248 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.248 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.248 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.248 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.248 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.248 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.249 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.249 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.249 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.249 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.249 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.249 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.249 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.250 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.250 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.250 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.250 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.250 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.251 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.251 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.251 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.251 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.251 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.251 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.251 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.252 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.252 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.252 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.252 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.252 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.252 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.253 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.253 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.253 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.253 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.253 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.254 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.254 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.254 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.254 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.254 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.254 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.254 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.255 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.255 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.255 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.255 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.255 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.256 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.256 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.256 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.256 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.256 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.256 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.256 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.257 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.257 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.257 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.257 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.257 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.257 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.258 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.258 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.258 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.258 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.258 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.258 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.258 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.259 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.259 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.259 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.259 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.259 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.259 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.259 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.260 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.260 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.260 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.260 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.260 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.261 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.261 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.261 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.261 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.261 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.261 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.262 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.262 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.262 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.262 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.262 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.263 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.263 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.263 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.263 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.263 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.263 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.264 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.264 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.264 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.264 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.264 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.265 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.265 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.265 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.265 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.265 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.265 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.265 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.266 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.266 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.266 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.266 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.266 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.267 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.267 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.267 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.267 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.267 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.267 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.267 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.268 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.268 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.268 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.268 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.268 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.268 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.269 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.269 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.269 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.269 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.269 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.270 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.270 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.270 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.270 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.270 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.270 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.271 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.271 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.271 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.271 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.271 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.271 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.271 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.272 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.272 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.272 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.272 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.272 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.272 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.272 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.273 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.273 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.273 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.273 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.273 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.273 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.274 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.274 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.274 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.274 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.274 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.274 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.275 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.275 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.275 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.275 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.275 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.275 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.275 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.276 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.276 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.276 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.276 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.276 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.277 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.277 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.277 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.277 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.277 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.277 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.278 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.278 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.278 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.278 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.278 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.278 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.278 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.279 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.279 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.279 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.279 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.279 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.279 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.279 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.280 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.280 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.280 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.280 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.281 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.281 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.281 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.281 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.281 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.281 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.282 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.282 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.282 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.282 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.282 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.283 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.283 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.283 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.283 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.283 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.284 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.284 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.284 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.284 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.284 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.284 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.284 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.285 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.285 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.285 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.285 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.285 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.285 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.285 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.286 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.286 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.286 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.286 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.286 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.286 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.286 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.287 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.287 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.287 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.287 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.287 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.287 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.287 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.288 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.288 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.288 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.288 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.288 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.288 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.289 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.289 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.289 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.289 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.289 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.289 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.289 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.290 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.290 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.290 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.290 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.290 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.290 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.290 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.291 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.291 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.291 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.291 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.291 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.291 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.291 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.292 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.292 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.292 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.292 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.292 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.292 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.292 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.293 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.293 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.293 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.293 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.293 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.293 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.294 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.294 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.294 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.294 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.294 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.294 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.294 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.295 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.295 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.295 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.295 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.295 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.295 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.295 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.296 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.296 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.296 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.296 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.296 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.296 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.296 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.297 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.297 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.297 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.297 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.297 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.297 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.298 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.298 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.298 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.298 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.298 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.298 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.298 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.299 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.299 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.299 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.299 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.299 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.299 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.299 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.300 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.300 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.300 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.300 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.300 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.300 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.301 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.301 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.301 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.301 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.301 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.301 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.301 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.302 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.302 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.302 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.302 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.302 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.302 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.302 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.303 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.303 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.303 186313 WARNING oslo_config.cfg [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 11:11:01 np0005538960 nova_compute[186309]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 11:11:01 np0005538960 nova_compute[186309]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 11:11:01 np0005538960 nova_compute[186309]: and ``live_migration_inbound_addr`` respectively.
Nov 28 11:11:01 np0005538960 nova_compute[186309]: ).  Its value may be silently ignored in the future.#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.303 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.303 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.304 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.304 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.304 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.304 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.304 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.305 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.305 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.305 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.305 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.305 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.305 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.306 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.306 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.306 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.306 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.306 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.306 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.306 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.307 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.307 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.307 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.307 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.307 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.307 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.307 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.308 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.308 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.308 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.308 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.308 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.308 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.309 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.309 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.309 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.309 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.309 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.309 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.309 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.310 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.310 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.310 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.310 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.310 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.310 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.310 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.311 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.311 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.311 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.311 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.311 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.311 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.312 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.312 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.312 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.312 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.312 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.312 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.313 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.313 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.313 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.313 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.313 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.313 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.313 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.313 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.314 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.314 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.314 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.314 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.314 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.314 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.314 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.315 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.315 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.315 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.315 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.315 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.315 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.315 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.316 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.316 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.316 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.316 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.316 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.316 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.316 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.317 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.317 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.317 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.317 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.317 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.317 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.318 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.318 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.318 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.318 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.318 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.318 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.318 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.319 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.319 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.319 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.319 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.319 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.319 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.319 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.320 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.320 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.320 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.320 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.320 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.320 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.320 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.321 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.321 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.321 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.321 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.321 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.321 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.321 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.322 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.322 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.322 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.322 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.322 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.322 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.322 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.323 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.323 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.323 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.323 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.323 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.323 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.323 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.324 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.324 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.324 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.324 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.324 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.324 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.325 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.325 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.325 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.325 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.325 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.325 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.325 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.326 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.326 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.326 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.326 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.326 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.326 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.327 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.327 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.327 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.327 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.327 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.327 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.327 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.328 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.328 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.328 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.328 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.328 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.328 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.328 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.329 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.329 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.329 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.329 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.329 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.329 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.329 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.330 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.330 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.330 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.330 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.330 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.331 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.331 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.331 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.331 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.331 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.331 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.331 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.332 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.332 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.332 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.332 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.332 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.332 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.332 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.333 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.333 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.333 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.333 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.333 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.334 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.334 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.334 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.334 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.334 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.334 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.334 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.335 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.335 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.335 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.335 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.335 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.335 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.335 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.336 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.336 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.336 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.336 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.336 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.336 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.336 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.337 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.337 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.337 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.337 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.337 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.337 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.337 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.338 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.338 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.338 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.338 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.338 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.338 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.339 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.339 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.339 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.339 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.339 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.339 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.340 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.340 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.340 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.340 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.340 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.340 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.341 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.341 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.341 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.341 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.342 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.342 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.342 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.342 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.342 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.343 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.343 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.343 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.343 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.344 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.344 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.344 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.344 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.344 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.345 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.345 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.345 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.345 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.345 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.346 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.346 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.346 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.346 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.346 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.347 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.347 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.347 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.347 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.347 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.348 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.348 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.348 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.348 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.348 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.348 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.349 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.349 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.349 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.349 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.349 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.349 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.350 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.350 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.350 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.350 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.350 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.350 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.351 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.351 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.351 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.351 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.351 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.351 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.351 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.352 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.352 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.352 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.352 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.352 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.352 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.352 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.353 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.353 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.353 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.353 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.353 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.353 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.353 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.354 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.354 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.354 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.354 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.354 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.354 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.355 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.355 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.355 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.355 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.355 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.355 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.355 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.356 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.356 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.356 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.356 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.356 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.356 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.356 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.357 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.357 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.357 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.357 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.357 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.357 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.358 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.358 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.358 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.358 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.358 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.358 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.358 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.359 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.359 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.359 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.359 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.359 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.359 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.359 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.359 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.360 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.360 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.360 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.360 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.360 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.360 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.360 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.361 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.361 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.361 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.361 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.361 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.361 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.362 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.362 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.362 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.362 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.362 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.362 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.362 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.363 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.363 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.363 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.363 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.363 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.363 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.363 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.364 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.364 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.364 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.364 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.364 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.364 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.364 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.364 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.365 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.365 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.365 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.365 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.365 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.366 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.366 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.366 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.366 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.366 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.366 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.366 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.367 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.367 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.367 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.367 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.367 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.367 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.367 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.368 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.368 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.368 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.368 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.368 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.368 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.368 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.369 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.369 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.369 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.369 186313 DEBUG oslo_service.service [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.370 186313 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.383 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.384 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.384 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.385 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 28 11:11:01 np0005538960 systemd[1]: Starting libvirt QEMU daemon...
Nov 28 11:11:01 np0005538960 systemd[1]: Started libvirt QEMU daemon.
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.467 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f64098e8af0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.471 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f64098e8af0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.472 186313 INFO nova.virt.libvirt.driver [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.485 186313 WARNING nova.virt.libvirt.driver [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.#033[00m
Nov 28 11:11:01 np0005538960 nova_compute[186309]: 2025-11-28 16:11:01.485 186313 DEBUG nova.virt.libvirt.volume.mount [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Nov 28 11:11:01 np0005538960 python3.9[186775]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.465 186313 INFO nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <host>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <uuid>514cc562-2bfa-41c3-bde8-d8f80e6fac29</uuid>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <arch>x86_64</arch>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model>EPYC-Rome-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <vendor>AMD</vendor>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <microcode version='16777317'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <signature family='23' model='49' stepping='0'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='x2apic'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='tsc-deadline'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='osxsave'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='hypervisor'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='tsc_adjust'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='spec-ctrl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='stibp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='arch-capabilities'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='cmp_legacy'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='topoext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='virt-ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='lbrv'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='tsc-scale'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='vmcb-clean'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='pause-filter'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='pfthreshold'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='svme-addr-chk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='rdctl-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='skip-l1dfl-vmentry'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='mds-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature name='pschange-mc-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <pages unit='KiB' size='4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <pages unit='KiB' size='2048'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <pages unit='KiB' size='1048576'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <power_management>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <suspend_mem/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <suspend_disk/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <suspend_hybrid/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </power_management>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <iommu support='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <migration_features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <live/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <uri_transports>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <uri_transport>tcp</uri_transport>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <uri_transport>rdma</uri_transport>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </uri_transports>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </migration_features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <topology>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <cells num='1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <cell id='0'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:          <memory unit='KiB'>7864316</memory>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:          <pages unit='KiB' size='4'>1966079</pages>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:          <pages unit='KiB' size='2048'>0</pages>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:          <distances>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:            <sibling id='0' value='10'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:          </distances>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:          <cpus num='8'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:          </cpus>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        </cell>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </cells>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </topology>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <cache>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </cache>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <secmodel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model>selinux</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <doi>0</doi>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </secmodel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <secmodel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model>dac</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <doi>0</doi>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </secmodel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </host>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <guest>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <os_type>hvm</os_type>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <arch name='i686'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <wordsize>32</wordsize>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <domain type='qemu'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <domain type='kvm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </arch>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <pae/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <nonpae/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <acpi default='on' toggle='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <apic default='on' toggle='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <cpuselection/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <deviceboot/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <disksnapshot default='on' toggle='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <externalSnapshot/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </guest>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <guest>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <os_type>hvm</os_type>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <arch name='x86_64'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <wordsize>64</wordsize>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <domain type='qemu'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <domain type='kvm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </arch>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <acpi default='on' toggle='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <apic default='on' toggle='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <cpuselection/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <deviceboot/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <disksnapshot default='on' toggle='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <externalSnapshot/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </guest>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 
Nov 28 11:11:02 np0005538960 nova_compute[186309]: </capabilities>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.472 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.491 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 11:11:02 np0005538960 nova_compute[186309]: <domainCapabilities>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <domain>kvm</domain>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <arch>i686</arch>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <vcpu max='240'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <iothreads supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <os supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <enum name='firmware'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <loader supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>rom</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pflash</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='readonly'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>yes</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>no</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='secure'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>no</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </loader>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </os>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='host-passthrough' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='hostPassthroughMigratable'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>on</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>off</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='maximum' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='maximumMigratable'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>on</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>off</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='host-model' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <vendor>AMD</vendor>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='x2apic'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='hypervisor'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='stibp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='overflow-recov'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='succor'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='lbrv'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc-scale'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='flushbyasid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='pause-filter'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='pfthreshold'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='disable' name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='custom' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Dhyana-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Genoa'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='auto-ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='auto-ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-128'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-256'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-512'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v6'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v7'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='KnightsMill'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512er'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512pf'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='KnightsMill-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512er'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512pf'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G4-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tbm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G5-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tbm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SierraForest'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cmpccxadd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SierraForest-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cmpccxadd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='athlon'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='athlon-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='core2duo'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='core2duo-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='coreduo'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='coreduo-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='n270'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='n270-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='phenom'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='phenom-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <memoryBacking supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <enum name='sourceType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>file</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>anonymous</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>memfd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </memoryBacking>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <devices>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <disk supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='diskDevice'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>disk</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>cdrom</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>floppy</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>lun</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='bus'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>ide</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>fdc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>scsi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>sata</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-non-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </disk>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <graphics supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vnc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>egl-headless</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dbus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </graphics>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <video supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='modelType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vga</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>cirrus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>none</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>bochs</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>ramfb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </video>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <hostdev supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='mode'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>subsystem</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='startupPolicy'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>default</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>mandatory</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>requisite</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>optional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='subsysType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pci</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>scsi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='capsType'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='pciBackend'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </hostdev>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <rng supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-non-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>random</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>egd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>builtin</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </rng>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <filesystem supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='driverType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>path</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>handle</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtiofs</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </filesystem>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <tpm supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tpm-tis</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tpm-crb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>emulator</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>external</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendVersion'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>2.0</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </tpm>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <redirdev supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='bus'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </redirdev>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <channel supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pty</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>unix</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </channel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <crypto supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>qemu</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>builtin</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </crypto>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <interface supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>default</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>passt</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </interface>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <panic supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>isa</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>hyperv</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </panic>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <console supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>null</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pty</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dev</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>file</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pipe</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>stdio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>udp</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tcp</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>unix</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>qemu-vdagent</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dbus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </console>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </devices>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <gic supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <vmcoreinfo supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <genid supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <backingStoreInput supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <backup supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <async-teardown supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <ps2 supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <sev supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <sgx supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <hyperv supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='features'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>relaxed</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vapic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>spinlocks</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vpindex</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>runtime</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>synic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>stimer</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>reset</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vendor_id</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>frequencies</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>reenlightenment</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tlbflush</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>ipi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>avic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>emsr_bitmap</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>xmm_input</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <defaults>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <spinlocks>4095</spinlocks>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <stimer_direct>on</stimer_direct>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </defaults>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </hyperv>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <launchSecurity supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='sectype'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tdx</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </launchSecurity>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: </domainCapabilities>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.497 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 11:11:02 np0005538960 nova_compute[186309]: <domainCapabilities>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <domain>kvm</domain>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <arch>i686</arch>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <vcpu max='4096'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <iothreads supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <os supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <enum name='firmware'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <loader supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>rom</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pflash</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='readonly'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>yes</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>no</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='secure'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>no</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </loader>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </os>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='host-passthrough' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='hostPassthroughMigratable'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>on</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>off</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='maximum' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='maximumMigratable'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>on</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>off</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='host-model' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <vendor>AMD</vendor>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='x2apic'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='hypervisor'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='stibp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='overflow-recov'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='succor'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='lbrv'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc-scale'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='flushbyasid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='pause-filter'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='pfthreshold'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='disable' name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='custom' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Dhyana-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Genoa'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='auto-ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='auto-ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-128'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-256'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-512'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v6'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v7'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='KnightsMill'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512er'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512pf'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='KnightsMill-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512er'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512pf'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G4-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tbm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G5-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tbm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SierraForest'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cmpccxadd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SierraForest-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cmpccxadd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='athlon'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='athlon-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='core2duo'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='core2duo-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='coreduo'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='coreduo-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='n270'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='n270-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='phenom'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='phenom-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <memoryBacking supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <enum name='sourceType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>file</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>anonymous</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>memfd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </memoryBacking>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <devices>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <disk supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='diskDevice'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>disk</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>cdrom</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>floppy</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>lun</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='bus'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>fdc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>scsi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>sata</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-non-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </disk>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <graphics supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vnc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>egl-headless</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dbus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </graphics>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <video supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='modelType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vga</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>cirrus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>none</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>bochs</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>ramfb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </video>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <hostdev supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='mode'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>subsystem</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='startupPolicy'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>default</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>mandatory</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>requisite</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>optional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='subsysType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pci</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>scsi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='capsType'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='pciBackend'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </hostdev>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <rng supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-non-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>random</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>egd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>builtin</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </rng>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <filesystem supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='driverType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>path</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>handle</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtiofs</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </filesystem>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <tpm supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tpm-tis</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tpm-crb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>emulator</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>external</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendVersion'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>2.0</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </tpm>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <redirdev supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='bus'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </redirdev>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <channel supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pty</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>unix</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </channel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <crypto supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>qemu</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>builtin</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </crypto>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <interface supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>default</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>passt</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </interface>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <panic supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>isa</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>hyperv</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </panic>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <console supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>null</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pty</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dev</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>file</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pipe</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>stdio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>udp</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tcp</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>unix</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>qemu-vdagent</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dbus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </console>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </devices>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <gic supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <vmcoreinfo supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <genid supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <backingStoreInput supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <backup supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <async-teardown supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <ps2 supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <sev supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <sgx supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <hyperv supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='features'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>relaxed</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vapic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>spinlocks</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vpindex</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>runtime</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>synic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>stimer</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>reset</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vendor_id</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>frequencies</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>reenlightenment</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tlbflush</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>ipi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>avic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>emsr_bitmap</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>xmm_input</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <defaults>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <spinlocks>4095</spinlocks>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <stimer_direct>on</stimer_direct>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </defaults>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </hyperv>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <launchSecurity supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='sectype'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tdx</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </launchSecurity>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: </domainCapabilities>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.526 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.530 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 11:11:02 np0005538960 nova_compute[186309]: <domainCapabilities>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <domain>kvm</domain>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <arch>x86_64</arch>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <vcpu max='240'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <iothreads supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <os supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <enum name='firmware'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <loader supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>rom</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pflash</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='readonly'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>yes</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>no</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='secure'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>no</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </loader>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </os>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='host-passthrough' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='hostPassthroughMigratable'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>on</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>off</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='maximum' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='maximumMigratable'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>on</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>off</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='host-model' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <vendor>AMD</vendor>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='x2apic'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='hypervisor'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='stibp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='overflow-recov'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='succor'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='lbrv'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc-scale'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='flushbyasid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='pause-filter'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='pfthreshold'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='disable' name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='custom' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Dhyana-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Genoa'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='auto-ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='auto-ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-128'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-256'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-512'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v6'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v7'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='KnightsMill'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512er'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512pf'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='KnightsMill-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512er'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512pf'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G4-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tbm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G5-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tbm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SierraForest'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cmpccxadd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SierraForest-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cmpccxadd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='athlon'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='athlon-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='core2duo'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='core2duo-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='coreduo'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='coreduo-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='n270'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='n270-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='phenom'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='phenom-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <memoryBacking supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <enum name='sourceType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>file</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>anonymous</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>memfd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </memoryBacking>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <devices>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <disk supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='diskDevice'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>disk</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>cdrom</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>floppy</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>lun</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='bus'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>ide</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>fdc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>scsi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>sata</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-non-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </disk>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <graphics supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vnc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>egl-headless</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dbus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </graphics>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <video supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='modelType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vga</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>cirrus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>none</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>bochs</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>ramfb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </video>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <hostdev supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='mode'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>subsystem</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='startupPolicy'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>default</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>mandatory</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>requisite</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>optional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='subsysType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pci</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>scsi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='capsType'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='pciBackend'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </hostdev>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <rng supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-non-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>random</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>egd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>builtin</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </rng>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <filesystem supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='driverType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>path</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>handle</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtiofs</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </filesystem>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <tpm supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tpm-tis</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tpm-crb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>emulator</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>external</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendVersion'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>2.0</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </tpm>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <redirdev supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='bus'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </redirdev>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <channel supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pty</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>unix</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </channel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <crypto supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>qemu</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>builtin</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </crypto>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <interface supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>default</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>passt</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </interface>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <panic supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>isa</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>hyperv</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </panic>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <console supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>null</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pty</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dev</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>file</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pipe</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>stdio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>udp</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tcp</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>unix</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>qemu-vdagent</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dbus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </console>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </devices>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <gic supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <vmcoreinfo supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <genid supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <backingStoreInput supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <backup supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <async-teardown supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <ps2 supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <sev supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <sgx supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <hyperv supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='features'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>relaxed</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vapic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>spinlocks</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vpindex</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>runtime</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>synic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>stimer</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>reset</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vendor_id</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>frequencies</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>reenlightenment</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tlbflush</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>ipi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>avic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>emsr_bitmap</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>xmm_input</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <defaults>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <spinlocks>4095</spinlocks>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <stimer_direct>on</stimer_direct>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </defaults>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </hyperv>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <launchSecurity supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='sectype'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tdx</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </launchSecurity>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: </domainCapabilities>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
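The `<domainCapabilities>` dump above can be inspected offline without a libvirt connection; a minimal sketch (using only Python's standard `xml.etree.ElementTree`, with `CAPS_XML` and `usable_custom_models` as illustrative names, and a shortened excerpt of the logged XML) that extracts the CPU models libvirt reports as usable under the custom CPU mode:

```python
import xml.etree.ElementTree as ET

# Shortened excerpt of the <domainCapabilities> XML logged above.
CAPS_XML = """
<domainCapabilities>
  <cpu>
    <mode name='custom' supported='yes'>
      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
    </mode>
  </cpu>
</domainCapabilities>
"""

def usable_custom_models(caps_xml: str) -> list[str]:
    """Return the names of CPU models marked usable='yes' in the
    custom CPU mode of a domainCapabilities document."""
    root = ET.fromstring(caps_xml)
    # ElementTree supports attribute predicates in its limited XPath subset.
    return [
        m.text
        for m in root.findall("./cpu/mode[@name='custom']/model")
        if m.get("usable") == "yes"
    ]

print(usable_custom_models(CAPS_XML))  # ['qemu64-v1', 'Conroe-v1']
```

Nova performs the equivalent filtering when deciding which named CPU models a flavor or image may request on this host; against a live host the same XML comes from libvirt's `getDomainCapabilities()` call.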
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.593 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 11:11:02 np0005538960 nova_compute[186309]: <domainCapabilities>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <domain>kvm</domain>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <arch>x86_64</arch>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <vcpu max='4096'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <iothreads supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <os supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <enum name='firmware'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>efi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <loader supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>rom</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pflash</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='readonly'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>yes</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>no</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='secure'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>yes</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>no</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </loader>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </os>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='host-passthrough' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='hostPassthroughMigratable'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>on</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>off</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='maximum' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='maximumMigratable'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>on</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>off</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='host-model' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <vendor>AMD</vendor>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='x2apic'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='hypervisor'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='stibp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='overflow-recov'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='succor'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='lbrv'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='tsc-scale'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='flushbyasid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='pause-filter'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='pfthreshold'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <feature policy='disable' name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <mode name='custom' supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Broadwell-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Cooperlake-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Denverton-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Dhyana-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Genoa'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='auto-ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='auto-ibrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Milan-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amd-psfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='stibp-always-on'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-Rome-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='EPYC-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='GraniteRapids-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-128'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-256'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx10-512'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='prefetchiti'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Haswell-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v6'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Icelake-Server-v7'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='IvyBridge-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='KnightsMill'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512er'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512pf'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='KnightsMill-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512er'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512pf'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G4-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tbm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Opteron_G5-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fma4'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tbm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xop'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SapphireRapids-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='amx-tile'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-bf16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-fp16'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bitalg'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrc'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fzrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='la57'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='taa-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xfd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SierraForest'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cmpccxadd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='SierraForest-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ifma'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cmpccxadd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fbsdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='fsrs'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ibrs-all'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mcdt-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pbrsb-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='psdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='serialize'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vaes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Client-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='hle'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='rtm'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Skylake-Server-v5'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512bw'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512cd'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512dq'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512f'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='avx512vl'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='invpcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pcid'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='pku'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='mpx'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v2'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v3'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='core-capability'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='split-lock-detect'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='Snowridge-v4'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='cldemote'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='erms'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='gfni'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdir64b'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='movdiri'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='xsaves'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='athlon'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='athlon-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='core2duo'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='core2duo-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='coreduo'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='coreduo-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='n270'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='n270-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='ss'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='phenom'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <blockers model='phenom-v1'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnow'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <feature name='3dnowext'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </blockers>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </mode>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <memoryBacking supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <enum name='sourceType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>file</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>anonymous</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <value>memfd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </memoryBacking>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <devices>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <disk supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='diskDevice'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>disk</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>cdrom</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>floppy</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>lun</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='bus'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>fdc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>scsi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>sata</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-non-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </disk>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <graphics supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vnc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>egl-headless</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dbus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </graphics>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <video supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='modelType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vga</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>cirrus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>none</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>bochs</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>ramfb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </video>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <hostdev supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='mode'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>subsystem</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='startupPolicy'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>default</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>mandatory</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>requisite</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>optional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='subsysType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pci</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>scsi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='capsType'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='pciBackend'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </hostdev>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <rng supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtio-non-transitional</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>random</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>egd</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>builtin</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </rng>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <filesystem supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='driverType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>path</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>handle</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>virtiofs</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </filesystem>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <tpm supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tpm-tis</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tpm-crb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>emulator</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>external</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendVersion'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>2.0</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </tpm>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <redirdev supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='bus'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>usb</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </redirdev>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <channel supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pty</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>unix</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </channel>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <crypto supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>qemu</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendModel'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>builtin</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </crypto>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <interface supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='backendType'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>default</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>passt</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </interface>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <panic supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='model'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>isa</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>hyperv</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </panic>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <console supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='type'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>null</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vc</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pty</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dev</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>file</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>pipe</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>stdio</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>udp</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tcp</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>unix</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>qemu-vdagent</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>dbus</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </console>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </devices>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <gic supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <vmcoreinfo supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <genid supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <backingStoreInput supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <backup supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <async-teardown supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <ps2 supported='yes'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <sev supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <sgx supported='no'/>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <hyperv supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='features'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>relaxed</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vapic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>spinlocks</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vpindex</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>runtime</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>synic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>stimer</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>reset</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>vendor_id</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>frequencies</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>reenlightenment</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tlbflush</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>ipi</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>avic</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>emsr_bitmap</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>xmm_input</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <defaults>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <spinlocks>4095</spinlocks>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <stimer_direct>on</stimer_direct>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </defaults>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </hyperv>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    <launchSecurity supported='yes'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      <enum name='sectype'>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:        <value>tdx</value>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:      </enum>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:    </launchSecurity>
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  </features>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: </domainCapabilities>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.659 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.660 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.660 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.660 186313 INFO nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Secure Boot support detected
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.662 186313 INFO nova.virt.libvirt.driver [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.662 186313 INFO nova.virt.libvirt.driver [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.672 186313 DEBUG nova.virt.libvirt.driver [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] cpu compare xml: <cpu match="exact">
Nov 28 11:11:02 np0005538960 nova_compute[186309]:  <model>Nehalem</model>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: </cpu>
Nov 28 11:11:02 np0005538960 nova_compute[186309]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.675 186313 DEBUG nova.virt.libvirt.driver [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.716 186313 INFO nova.virt.node [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Determined node identity 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from /var/lib/nova/compute_id
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.745 186313 WARNING nova.compute.manager [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Compute nodes ['65f0ce30-d9ca-4c16-b536-acd92f5f41ce'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.780 186313 INFO nova.compute.manager [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.818 186313 WARNING nova.compute.manager [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.819 186313 DEBUG oslo_concurrency.lockutils [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.819 186313 DEBUG oslo_concurrency.lockutils [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.819 186313 DEBUG oslo_concurrency.lockutils [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:11:02 np0005538960 nova_compute[186309]: 2025-11-28 16:11:02.819 186313 DEBUG nova.compute.resource_tracker [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 11:11:02 np0005538960 systemd[1]: Starting libvirt nodedev daemon...
Nov 28 11:11:02 np0005538960 systemd[1]: Started libvirt nodedev daemon.
Nov 28 11:11:02 np0005538960 python3.9[186991]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.138 186313 WARNING nova.virt.libvirt.driver [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.140 186313 DEBUG nova.compute.resource_tracker [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6210MB free_disk=73.54626083374023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.140 186313 DEBUG oslo_concurrency.lockutils [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.140 186313 DEBUG oslo_concurrency.lockutils [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.152 186313 WARNING nova.compute.resource_tracker [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] No compute node record for compute-1.ctlplane.example.com:65f0ce30-d9ca-4c16-b536-acd92f5f41ce: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 65f0ce30-d9ca-4c16-b536-acd92f5f41ce could not be found.#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.176 186313 INFO nova.compute.resource_tracker [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.235 186313 DEBUG nova.compute.resource_tracker [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.235 186313 DEBUG nova.compute.resource_tracker [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:11:03 np0005538960 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.391 186313 INFO nova.scheduler.client.report [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] [req-00630ba9-dfe8-4b45-9c6b-b310c96e6847] Created resource provider record via placement API for resource provider with UUID 65f0ce30-d9ca-4c16-b536-acd92f5f41ce and name compute-1.ctlplane.example.com.#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.461 186313 DEBUG nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 28 11:11:03 np0005538960 nova_compute[186309]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.461 186313 INFO nova.virt.libvirt.host [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.462 186313 DEBUG nova.compute.provider_tree [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.462 186313 DEBUG nova.virt.libvirt.driver [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.465 186313 DEBUG nova.virt.libvirt.driver [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Libvirt baseline CPU <cpu>
Nov 28 11:11:03 np0005538960 nova_compute[186309]:  <arch>x86_64</arch>
Nov 28 11:11:03 np0005538960 nova_compute[186309]:  <model>Nehalem</model>
Nov 28 11:11:03 np0005538960 nova_compute[186309]:  <vendor>AMD</vendor>
Nov 28 11:11:03 np0005538960 nova_compute[186309]:  <topology sockets="8" cores="1" threads="1"/>
Nov 28 11:11:03 np0005538960 nova_compute[186309]: </cpu>
Nov 28 11:11:03 np0005538960 nova_compute[186309]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.519 186313 DEBUG nova.scheduler.client.report [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Updated inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.520 186313 DEBUG nova.compute.provider_tree [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Updating resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.520 186313 DEBUG nova.compute.provider_tree [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.612 186313 DEBUG nova.compute.provider_tree [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Updating resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.631 186313 DEBUG nova.compute.resource_tracker [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.632 186313 DEBUG oslo_concurrency.lockutils [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.491s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.632 186313 DEBUG nova.service [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.685 186313 DEBUG nova.service [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 28 11:11:03 np0005538960 nova_compute[186309]: 2025-11-28 16:11:03.686 186313 DEBUG nova.servicegroup.drivers.db [None req-44139ce8-fdda-4e34-be1d-65e98ea25aae - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 28 11:11:03 np0005538960 python3.9[187188]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:11:04 np0005538960 systemd[1]: Stopping nova_compute container...
Nov 28 11:11:04 np0005538960 nova_compute[186309]: 2025-11-28 16:11:04.226 186313 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored#033[00m
Nov 28 11:11:04 np0005538960 nova_compute[186309]: 2025-11-28 16:11:04.229 186313 DEBUG oslo_concurrency.lockutils [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:11:04 np0005538960 nova_compute[186309]: 2025-11-28 16:11:04.229 186313 DEBUG oslo_concurrency.lockutils [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:11:04 np0005538960 nova_compute[186309]: 2025-11-28 16:11:04.230 186313 DEBUG oslo_concurrency.lockutils [None req-f97a9fd0-6be2-45ba-b72a-4f889e11b579 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:11:04 np0005538960 virtqemud[186797]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Nov 28 11:11:04 np0005538960 virtqemud[186797]: hostname: compute-1
Nov 28 11:11:04 np0005538960 virtqemud[186797]: End of file while reading data: Input/output error
Nov 28 11:11:04 np0005538960 systemd[1]: libpod-6c636f4e46cac6bc79d1c12c20d169405a361dbe7a4b183763b35ccecbd8c224.scope: Deactivated successfully.
Nov 28 11:11:04 np0005538960 systemd[1]: libpod-6c636f4e46cac6bc79d1c12c20d169405a361dbe7a4b183763b35ccecbd8c224.scope: Consumed 3.653s CPU time.
Nov 28 11:11:04 np0005538960 podman[187192]: 2025-11-28 16:11:04.733051969 +0000 UTC m=+0.711993219 container died 6c636f4e46cac6bc79d1c12c20d169405a361dbe7a4b183763b35ccecbd8c224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:11:04 np0005538960 systemd[1]: var-lib-containers-storage-overlay-4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98-merged.mount: Deactivated successfully.
Nov 28 11:11:04 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6c636f4e46cac6bc79d1c12c20d169405a361dbe7a4b183763b35ccecbd8c224-userdata-shm.mount: Deactivated successfully.
Nov 28 11:11:04 np0005538960 podman[187192]: 2025-11-28 16:11:04.785330222 +0000 UTC m=+0.764271472 container cleanup 6c636f4e46cac6bc79d1c12c20d169405a361dbe7a4b183763b35ccecbd8c224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Nov 28 11:11:04 np0005538960 podman[187192]: nova_compute
Nov 28 11:11:04 np0005538960 podman[187223]: nova_compute
Nov 28 11:11:04 np0005538960 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Nov 28 11:11:04 np0005538960 systemd[1]: Stopped nova_compute container.
Nov 28 11:11:04 np0005538960 systemd[1]: Starting nova_compute container...
Nov 28 11:11:04 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:11:04 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Nov 28 11:11:04 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Nov 28 11:11:04 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Nov 28 11:11:04 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Nov 28 11:11:04 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ebd6076b7035939ee3713822604917e959ab37ea22d232c2411077fb5183c98/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 11:11:04 np0005538960 podman[187236]: 2025-11-28 16:11:04.979766705 +0000 UTC m=+0.091474646 container init 6c636f4e46cac6bc79d1c12c20d169405a361dbe7a4b183763b35ccecbd8c224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=nova_compute)
Nov 28 11:11:04 np0005538960 podman[187236]: 2025-11-28 16:11:04.984942797 +0000 UTC m=+0.096650728 container start 6c636f4e46cac6bc79d1c12c20d169405a361dbe7a4b183763b35ccecbd8c224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3)
Nov 28 11:11:04 np0005538960 podman[187236]: nova_compute
Nov 28 11:11:04 np0005538960 nova_compute[187252]: + sudo -E kolla_set_configs
Nov 28 11:11:04 np0005538960 systemd[1]: Started nova_compute container.
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Validating config file
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Copying service configuration files
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Deleting /etc/nova/nova.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Deleting /etc/ceph
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Creating directory /etc/ceph
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /etc/ceph
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Writing out command to execute
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Nov 28 11:11:05 np0005538960 nova_compute[187252]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Nov 28 11:11:05 np0005538960 nova_compute[187252]: ++ cat /run_command
Nov 28 11:11:05 np0005538960 nova_compute[187252]: + CMD=nova-compute
Nov 28 11:11:05 np0005538960 nova_compute[187252]: + ARGS=
Nov 28 11:11:05 np0005538960 nova_compute[187252]: + sudo kolla_copy_cacerts
Nov 28 11:11:05 np0005538960 nova_compute[187252]: + [[ ! -n '' ]]
Nov 28 11:11:05 np0005538960 nova_compute[187252]: + . kolla_extend_start
Nov 28 11:11:05 np0005538960 nova_compute[187252]: Running command: 'nova-compute'
Nov 28 11:11:05 np0005538960 nova_compute[187252]: + echo 'Running command: '\''nova-compute'\'''
Nov 28 11:11:05 np0005538960 nova_compute[187252]: + umask 0022
Nov 28 11:11:05 np0005538960 nova_compute[187252]: + exec nova-compute
Nov 28 11:11:05 np0005538960 python3.9[187415]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Nov 28 11:11:06 np0005538960 systemd[1]: Started libpod-conmon-401a51a2bda80a10ee7835dfe5ba0c351125c6832d283df0d74f7a4b6633e6ee.scope.
Nov 28 11:11:06 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:11:06 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cad54f6c91fb9085982c35192ac0c6f2f018cf2af5fd133a9e61bf87e3159065/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Nov 28 11:11:06 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cad54f6c91fb9085982c35192ac0c6f2f018cf2af5fd133a9e61bf87e3159065/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Nov 28 11:11:06 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cad54f6c91fb9085982c35192ac0c6f2f018cf2af5fd133a9e61bf87e3159065/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Nov 28 11:11:06 np0005538960 podman[187441]: 2025-11-28 16:11:06.17863096 +0000 UTC m=+0.150509431 container init 401a51a2bda80a10ee7835dfe5ba0c351125c6832d283df0d74f7a4b6633e6ee (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm)
Nov 28 11:11:06 np0005538960 podman[187441]: 2025-11-28 16:11:06.188229084 +0000 UTC m=+0.160107455 container start 401a51a2bda80a10ee7835dfe5ba0c351125c6832d283df0d74f7a4b6633e6ee (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:11:06 np0005538960 python3.9[187415]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Applying nova statedir ownership
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Nov 28 11:11:06 np0005538960 nova_compute_init[187462]: INFO:nova_statedir:Nova statedir ownership complete
Nov 28 11:11:06 np0005538960 systemd[1]: libpod-401a51a2bda80a10ee7835dfe5ba0c351125c6832d283df0d74f7a4b6633e6ee.scope: Deactivated successfully.
Nov 28 11:11:06 np0005538960 podman[187476]: 2025-11-28 16:11:06.280370197 +0000 UTC m=+0.023422515 container died 401a51a2bda80a10ee7835dfe5ba0c351125c6832d283df0d74f7a4b6633e6ee (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:11:06 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-401a51a2bda80a10ee7835dfe5ba0c351125c6832d283df0d74f7a4b6633e6ee-userdata-shm.mount: Deactivated successfully.
Nov 28 11:11:06 np0005538960 systemd[1]: var-lib-containers-storage-overlay-cad54f6c91fb9085982c35192ac0c6f2f018cf2af5fd133a9e61bf87e3159065-merged.mount: Deactivated successfully.
Nov 28 11:11:06 np0005538960 podman[187476]: 2025-11-28 16:11:06.312256974 +0000 UTC m=+0.055309282 container cleanup 401a51a2bda80a10ee7835dfe5ba0c351125c6832d283df0d74f7a4b6633e6ee (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 11:11:06 np0005538960 systemd[1]: libpod-conmon-401a51a2bda80a10ee7835dfe5ba0c351125c6832d283df0d74f7a4b6633e6ee.scope: Deactivated successfully.
Nov 28 11:11:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:11:06.329 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:11:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:11:06.330 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:11:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:11:06.330 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:11:06 np0005538960 systemd[1]: session-25.scope: Deactivated successfully.
Nov 28 11:11:06 np0005538960 systemd[1]: session-25.scope: Consumed 2min 12.152s CPU time.
Nov 28 11:11:06 np0005538960 systemd-logind[788]: Session 25 logged out. Waiting for processes to exit.
Nov 28 11:11:06 np0005538960 systemd-logind[788]: Removed session 25.
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.118 187256 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.119 187256 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.119 187256 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.119 187256 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.275 187256 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.305 187256 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.305 187256 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.730 187256 INFO nova.virt.driver [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.836 187256 INFO nova.compute.provider_config [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.901 187256 DEBUG oslo_concurrency.lockutils [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.901 187256 DEBUG oslo_concurrency.lockutils [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.902 187256 DEBUG oslo_concurrency.lockutils [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.902 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.902 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.902 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.902 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.903 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.903 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.903 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.903 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.903 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.903 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.903 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.904 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.904 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.904 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.904 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.904 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.905 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.905 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.905 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.905 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.905 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.905 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.906 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.906 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.906 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.906 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.906 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.907 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.907 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.907 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.907 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.908 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.908 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.908 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.908 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.908 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.909 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.909 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.909 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.909 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.909 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.910 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.910 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.910 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.910 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.910 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.911 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.911 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.911 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.911 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.911 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.912 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.912 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.912 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.912 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.912 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.912 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.912 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.913 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.913 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.913 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.913 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.913 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.913 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.913 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.913 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.914 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.914 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.914 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.914 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.914 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.914 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.914 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.915 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.915 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.915 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.915 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.915 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.915 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.915 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.916 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.916 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.916 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.916 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.916 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.916 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.916 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.917 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.917 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.917 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.917 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.917 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.917 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.917 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.918 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.918 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.918 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.918 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.918 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.918 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.918 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.919 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.919 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.919 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.919 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.919 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.919 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.919 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.920 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.920 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.920 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.920 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.920 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.920 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.920 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.921 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.921 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.921 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.921 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.921 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.921 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.921 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.921 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.922 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.922 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.922 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.922 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.922 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.922 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.922 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.923 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.923 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.923 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.923 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.923 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.923 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.923 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.924 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.924 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.924 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.924 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.924 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.924 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.924 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.925 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.925 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.925 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.925 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.925 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.925 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.925 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.926 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.926 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.926 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.926 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.926 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.926 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.927 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.927 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.927 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.927 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.927 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.927 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.928 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.928 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.928 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.928 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.928 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.928 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.929 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.929 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.929 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.929 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.929 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.929 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.930 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.930 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.930 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.930 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.930 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.930 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.930 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.931 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.931 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.931 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.931 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.931 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.931 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.932 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.932 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.932 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.932 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.932 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.932 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.933 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.933 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.933 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.933 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.933 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.933 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.934 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.934 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.934 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.934 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.934 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.934 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.935 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.935 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.935 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.935 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.935 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.935 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.935 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.936 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.936 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.936 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.936 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.936 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.936 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.937 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.937 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.937 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.937 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.937 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.937 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.937 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.938 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.938 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.938 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.938 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.938 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.938 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.939 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.939 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.939 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.939 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.939 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.939 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.940 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.940 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.940 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.940 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.940 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.940 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.941 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.941 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.941 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.941 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.941 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.941 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.942 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.942 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.942 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.942 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.942 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.942 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.943 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.943 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.943 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.943 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.943 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.943 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.944 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.944 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.944 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.944 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.944 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.944 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.945 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.945 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.945 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.945 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.945 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.945 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.946 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.946 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.946 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.946 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.946 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.946 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.946 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.947 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.947 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.947 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.947 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.947 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.947 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.948 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.948 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.948 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.948 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.948 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.948 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.948 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.949 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.949 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.949 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.949 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.949 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.949 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.950 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.950 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.950 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.950 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.950 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.950 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.951 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.951 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.951 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.951 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.951 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.951 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.952 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.952 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.952 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.952 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.952 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.952 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.953 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.953 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.953 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.953 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.953 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.953 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.954 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.954 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.954 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.954 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.954 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.954 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.955 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.955 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.955 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.955 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.955 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.955 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.956 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.956 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.956 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.956 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.956 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.956 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.957 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.957 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.957 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.957 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.957 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.957 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.958 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.958 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.958 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.958 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.958 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.958 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.959 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.959 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.959 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.959 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.959 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.960 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.960 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.960 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.960 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.960 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.960 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.961 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.961 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.961 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.961 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.961 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.961 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.962 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.962 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.962 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.962 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.962 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.962 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.962 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.963 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.963 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.963 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.963 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.963 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.963 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.964 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.964 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.964 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.964 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.964 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.964 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.965 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.965 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.965 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.965 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.965 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.965 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.966 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.966 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.966 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.966 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.966 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.966 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.967 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.967 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.967 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.967 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.967 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.967 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.968 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.968 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.968 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.968 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.968 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.968 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.968 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.969 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.969 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.969 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.969 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.969 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.969 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.970 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.970 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.970 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.970 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.970 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.970 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.971 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.971 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.971 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.971 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.971 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.971 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.972 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.972 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.972 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.972 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.972 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.972 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.973 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.973 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.973 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.973 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.973 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.973 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.974 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.974 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.974 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.974 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.974 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.974 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.974 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.975 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.975 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.975 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.975 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.975 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.976 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.976 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.976 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.976 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.976 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.976 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.977 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.977 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.977 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.977 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.977 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.977 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.978 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.978 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.978 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.978 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.978 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.978 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.979 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.979 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.979 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.979 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.979 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.979 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.980 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.980 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.980 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.980 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.980 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.980 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.981 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.981 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.981 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.981 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.981 187256 WARNING oslo_config.cfg [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Nov 28 11:11:07 np0005538960 nova_compute[187252]: live_migration_uri is deprecated for removal in favor of two other options that
Nov 28 11:11:07 np0005538960 nova_compute[187252]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Nov 28 11:11:07 np0005538960 nova_compute[187252]: and ``live_migration_inbound_addr`` respectively.
Nov 28 11:11:07 np0005538960 nova_compute[187252]: ).  Its value may be silently ignored in the future.#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.982 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.982 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.982 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.982 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.982 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.983 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.983 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.983 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.983 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.983 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.983 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.984 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.984 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.984 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.985 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.985 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.985 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.985 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.985 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.985 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.985 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.986 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.986 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.986 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.986 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.986 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.986 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.987 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.987 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.987 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.987 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.987 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.988 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.988 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.988 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.988 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.988 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.988 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.989 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.989 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.989 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.989 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.989 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.989 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.989 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.990 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.990 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.990 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.990 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.990 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.990 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.991 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.991 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.991 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.991 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.991 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.991 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.992 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.992 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.992 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.992 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.992 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.992 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.993 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.993 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.993 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.993 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.993 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.993 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.993 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.994 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.994 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.994 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.994 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.994 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.994 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.995 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.995 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.995 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.995 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.995 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.995 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.996 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.996 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.996 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.996 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.996 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.996 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.997 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.997 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.997 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.997 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.997 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.997 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.998 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.998 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.998 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.998 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.998 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.998 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.999 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.999 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.999 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.999 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:07 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.999 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.999 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:07.999 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.000 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.000 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.000 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.000 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.000 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.001 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.001 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.001 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.001 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.001 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.001 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.001 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.002 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.002 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.002 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.002 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.002 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.002 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.003 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.003 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.003 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.003 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.003 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.003 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.004 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.004 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.004 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.004 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.004 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.004 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.005 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.005 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.005 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.005 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.005 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.005 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.006 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.006 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.006 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.006 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.006 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.006 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.006 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.007 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.007 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.007 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.007 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.007 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.008 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.008 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.008 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.008 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.008 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.009 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.009 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.009 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.009 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.009 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.009 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.010 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.010 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.010 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.010 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.010 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.010 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.011 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.011 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.011 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.011 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.011 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.011 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.012 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.012 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.012 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.012 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.012 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.012 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.013 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.013 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.013 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.013 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.013 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.013 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.013 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.014 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.014 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.014 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.014 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.014 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.014 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.015 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.015 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.015 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.015 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.015 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.015 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.015 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.016 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.016 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.016 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.016 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.016 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.017 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.017 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.017 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.017 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.017 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.017 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.017 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.018 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.018 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.018 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.018 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.018 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.018 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.019 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.019 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.019 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.019 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.019 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.019 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.020 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.020 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.020 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.020 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.020 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.020 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.021 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.021 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.021 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.021 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.021 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.021 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.022 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.022 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.022 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.022 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.022 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.022 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.023 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.023 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.023 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.023 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.023 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.024 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.024 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.024 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.024 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.024 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.025 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.025 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.025 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.025 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.025 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.025 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.026 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.026 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.026 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.026 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.026 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.027 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.027 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.027 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.027 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.027 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.027 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.028 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.028 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.028 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.028 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.028 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.028 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.029 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.029 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.029 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.029 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.029 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.029 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.030 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.030 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.030 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.030 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.030 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.030 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.031 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.031 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.031 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.031 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.031 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.032 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.032 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.032 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.032 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.032 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.032 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.033 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.033 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.033 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.033 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.033 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.034 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.034 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.034 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.034 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.034 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.034 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.035 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.035 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.035 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.035 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.035 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.036 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.036 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.036 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.036 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.036 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.037 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.037 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.037 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.037 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.037 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.038 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.038 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.038 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.038 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.038 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.038 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.039 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.039 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.039 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.039 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.039 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.040 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.040 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.040 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.040 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.040 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.041 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.041 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.041 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.041 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.041 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.042 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.042 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.042 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.042 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.042 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.042 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.042 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.043 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.043 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.043 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.043 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.043 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.043 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.043 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.044 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.044 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.044 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.044 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.044 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.044 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.045 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.045 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.045 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.045 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.045 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.045 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.045 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.045 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.046 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.046 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.046 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.046 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.046 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.046 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.046 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.047 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.047 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.047 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.047 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.047 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.047 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.048 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.048 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.048 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.048 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.048 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.048 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.049 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.049 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.049 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.049 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.049 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.049 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.049 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.050 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.050 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.050 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.050 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.050 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.051 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.051 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.051 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.051 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.051 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.051 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.052 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.052 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.052 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.052 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.052 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.052 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.053 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.053 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.053 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.053 187256 DEBUG oslo_service.service [None req-8d928f27-e9a4-49b5-b97b-23d02cf1c697 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.055 187256 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.081 187256 INFO nova.virt.node [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Determined node identity 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from /var/lib/nova/compute_id#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.082 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.083 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.083 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.083 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.099 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4930e40400> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.108 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4930e40400> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.109 187256 INFO nova.virt.libvirt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Connection event '1' reason 'None'#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.115 187256 INFO nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Libvirt host capabilities <capabilities>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <host>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <uuid>514cc562-2bfa-41c3-bde8-d8f80e6fac29</uuid>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <arch>x86_64</arch>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model>EPYC-Rome-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <vendor>AMD</vendor>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <microcode version='16777317'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <signature family='23' model='49' stepping='0'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <maxphysaddr mode='emulate' bits='40'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='x2apic'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='tsc-deadline'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='osxsave'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='hypervisor'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='tsc_adjust'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='spec-ctrl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='stibp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='arch-capabilities'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='cmp_legacy'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='topoext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='virt-ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='lbrv'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='tsc-scale'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='vmcb-clean'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='pause-filter'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='pfthreshold'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='svme-addr-chk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='rdctl-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='skip-l1dfl-vmentry'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='mds-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature name='pschange-mc-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <pages unit='KiB' size='4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <pages unit='KiB' size='2048'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <pages unit='KiB' size='1048576'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <power_management>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <suspend_mem/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <suspend_disk/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <suspend_hybrid/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </power_management>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <iommu support='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <migration_features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <live/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <uri_transports>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <uri_transport>tcp</uri_transport>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <uri_transport>rdma</uri_transport>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </uri_transports>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </migration_features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <topology>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <cells num='1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <cell id='0'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:          <memory unit='KiB'>7864316</memory>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:          <pages unit='KiB' size='4'>1966079</pages>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:          <pages unit='KiB' size='2048'>0</pages>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:          <pages unit='KiB' size='1048576'>0</pages>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:          <distances>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:            <sibling id='0' value='10'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:          </distances>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:          <cpus num='8'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:          </cpus>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        </cell>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </cells>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </topology>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <cache>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </cache>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <secmodel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model>selinux</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <doi>0</doi>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </secmodel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <secmodel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model>dac</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <doi>0</doi>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <baselabel type='kvm'>+107:+107</baselabel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <baselabel type='qemu'>+107:+107</baselabel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </secmodel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </host>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <guest>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <os_type>hvm</os_type>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <arch name='i686'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <wordsize>32</wordsize>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <domain type='qemu'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <domain type='kvm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </arch>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <pae/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <nonpae/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <acpi default='on' toggle='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <apic default='on' toggle='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <cpuselection/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <deviceboot/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <disksnapshot default='on' toggle='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <externalSnapshot/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </guest>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <guest>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <os_type>hvm</os_type>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <arch name='x86_64'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <wordsize>64</wordsize>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <domain type='qemu'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <domain type='kvm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </arch>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <acpi default='on' toggle='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <apic default='on' toggle='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <cpuselection/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <deviceboot/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <disksnapshot default='on' toggle='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <externalSnapshot/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </guest>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 
Nov 28 11:11:08 np0005538960 nova_compute[187252]: </capabilities>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.123 187256 DEBUG nova.virt.libvirt.volume.mount [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.125 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.131 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Nov 28 11:11:08 np0005538960 nova_compute[187252]: <domainCapabilities>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <domain>kvm</domain>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <arch>i686</arch>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <vcpu max='240'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <iothreads supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <os supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <enum name='firmware'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <loader supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>rom</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pflash</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='readonly'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>yes</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>no</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='secure'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>no</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </loader>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='host-passthrough' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='hostPassthroughMigratable'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>on</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>off</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='maximum' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='maximumMigratable'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>on</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>off</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='host-model' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <vendor>AMD</vendor>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='x2apic'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='hypervisor'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='stibp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='overflow-recov'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='succor'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='lbrv'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc-scale'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='flushbyasid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='pause-filter'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='pfthreshold'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='disable' name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='custom' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Dhyana-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Genoa'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='auto-ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='auto-ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-128'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-256'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-512'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v6'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v7'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='KnightsMill'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512er'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512pf'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='KnightsMill-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512er'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512pf'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G4-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tbm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G5-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tbm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SierraForest'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cmpccxadd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SierraForest-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cmpccxadd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='athlon'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='athlon-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='core2duo'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='core2duo-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='coreduo'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='coreduo-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='n270'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='n270-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='phenom'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='phenom-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <memoryBacking supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <enum name='sourceType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>file</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>anonymous</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>memfd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </memoryBacking>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <disk supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='diskDevice'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>disk</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>cdrom</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>floppy</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>lun</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='bus'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>ide</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>fdc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>scsi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>sata</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-non-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <graphics supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vnc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>egl-headless</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dbus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </graphics>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <video supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='modelType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vga</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>cirrus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>none</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>bochs</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>ramfb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <hostdev supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='mode'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>subsystem</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='startupPolicy'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>default</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>mandatory</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>requisite</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>optional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='subsysType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pci</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>scsi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='capsType'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='pciBackend'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </hostdev>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <rng supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-non-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>random</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>egd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>builtin</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <filesystem supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='driverType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>path</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>handle</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtiofs</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </filesystem>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <tpm supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tpm-tis</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tpm-crb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>emulator</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>external</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendVersion'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>2.0</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </tpm>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <redirdev supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='bus'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </redirdev>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <channel supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pty</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>unix</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </channel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <crypto supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>qemu</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>builtin</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </crypto>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <interface supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>default</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>passt</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <panic supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>isa</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>hyperv</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </panic>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <console supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>null</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pty</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dev</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>file</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pipe</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>stdio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>udp</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tcp</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>unix</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>qemu-vdagent</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dbus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </console>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <gic supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <vmcoreinfo supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <genid supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <backingStoreInput supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <backup supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <async-teardown supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <ps2 supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <sev supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <sgx supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <hyperv supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='features'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>relaxed</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vapic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>spinlocks</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vpindex</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>runtime</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>synic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>stimer</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>reset</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vendor_id</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>frequencies</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>reenlightenment</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tlbflush</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>ipi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>avic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>emsr_bitmap</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>xmm_input</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <defaults>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <spinlocks>4095</spinlocks>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <stimer_direct>on</stimer_direct>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </defaults>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </hyperv>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <launchSecurity supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='sectype'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tdx</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </launchSecurity>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: </domainCapabilities>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.142 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Nov 28 11:11:08 np0005538960 nova_compute[187252]: <domainCapabilities>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <domain>kvm</domain>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <arch>i686</arch>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <vcpu max='4096'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <iothreads supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <os supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <enum name='firmware'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <loader supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>rom</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pflash</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='readonly'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>yes</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>no</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='secure'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>no</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </loader>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='host-passthrough' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='hostPassthroughMigratable'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>on</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>off</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='maximum' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='maximumMigratable'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>on</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>off</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='host-model' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <vendor>AMD</vendor>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='x2apic'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='hypervisor'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='stibp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='overflow-recov'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='succor'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='lbrv'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc-scale'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='flushbyasid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='pause-filter'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='pfthreshold'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='disable' name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='custom' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Dhyana-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Genoa'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='auto-ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='auto-ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-128'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-256'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-512'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v6'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v7'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='KnightsMill'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512er'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512pf'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='KnightsMill-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512er'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512pf'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G4-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tbm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G5-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tbm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SierraForest'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cmpccxadd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SierraForest-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cmpccxadd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 podman[187532]: 2025-11-28 16:11:08.2208636 +0000 UTC m=+0.114287595 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='athlon'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='athlon-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='core2duo'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='core2duo-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='coreduo'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='coreduo-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='n270'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='n270-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='phenom'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='phenom-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <memoryBacking supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <enum name='sourceType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>file</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>anonymous</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>memfd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </memoryBacking>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <disk supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='diskDevice'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>disk</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>cdrom</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>floppy</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>lun</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='bus'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>fdc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>scsi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>sata</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-non-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <graphics supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vnc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>egl-headless</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dbus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </graphics>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <video supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='modelType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vga</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>cirrus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>none</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>bochs</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>ramfb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <hostdev supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='mode'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>subsystem</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='startupPolicy'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>default</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>mandatory</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>requisite</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>optional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='subsysType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pci</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>scsi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='capsType'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='pciBackend'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </hostdev>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <rng supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-non-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>random</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>egd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>builtin</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <filesystem supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='driverType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>path</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>handle</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtiofs</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </filesystem>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <tpm supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tpm-tis</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tpm-crb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>emulator</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>external</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendVersion'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>2.0</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </tpm>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <redirdev supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='bus'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </redirdev>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <channel supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pty</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>unix</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </channel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <crypto supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>qemu</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>builtin</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </crypto>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <interface supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>default</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>passt</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <panic supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>isa</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>hyperv</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </panic>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <console supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>null</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pty</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dev</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>file</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pipe</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>stdio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>udp</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tcp</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>unix</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>qemu-vdagent</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dbus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </console>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <gic supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <vmcoreinfo supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <genid supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <backingStoreInput supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <backup supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <async-teardown supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <ps2 supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <sev supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <sgx supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <hyperv supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='features'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>relaxed</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vapic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>spinlocks</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vpindex</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>runtime</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>synic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>stimer</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>reset</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vendor_id</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>frequencies</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>reenlightenment</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tlbflush</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>ipi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>avic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>emsr_bitmap</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>xmm_input</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <defaults>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <spinlocks>4095</spinlocks>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <stimer_direct>on</stimer_direct>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </defaults>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </hyperv>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <launchSecurity supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='sectype'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tdx</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </launchSecurity>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: </domainCapabilities>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.166 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.170 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Nov 28 11:11:08 np0005538960 nova_compute[187252]: <domainCapabilities>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <domain>kvm</domain>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <machine>pc-i440fx-rhel7.6.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <arch>x86_64</arch>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <vcpu max='240'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <iothreads supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <os supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <enum name='firmware'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <loader supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>rom</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pflash</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='readonly'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>yes</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>no</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='secure'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>no</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </loader>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='host-passthrough' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='hostPassthroughMigratable'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>on</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>off</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='maximum' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='maximumMigratable'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>on</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>off</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='host-model' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <vendor>AMD</vendor>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='x2apic'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='hypervisor'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='stibp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='overflow-recov'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='succor'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='lbrv'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc-scale'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='flushbyasid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='pause-filter'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='pfthreshold'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='disable' name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='custom' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Dhyana-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Genoa'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='auto-ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='auto-ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-128'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-256'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-512'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v6'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v7'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='KnightsMill'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512er'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512pf'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='KnightsMill-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512er'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512pf'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G4-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tbm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G5-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tbm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SierraForest'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cmpccxadd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SierraForest-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cmpccxadd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='athlon'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='athlon-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='core2duo'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='core2duo-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='coreduo'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='coreduo-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='n270'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='n270-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='phenom'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='phenom-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <memoryBacking supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <enum name='sourceType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>file</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>anonymous</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>memfd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </memoryBacking>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <disk supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='diskDevice'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>disk</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>cdrom</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>floppy</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>lun</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='bus'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>ide</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>fdc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>scsi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>sata</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-non-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <graphics supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vnc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>egl-headless</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dbus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </graphics>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <video supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='modelType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vga</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>cirrus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>none</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>bochs</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>ramfb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <hostdev supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='mode'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>subsystem</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='startupPolicy'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>default</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>mandatory</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>requisite</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>optional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='subsysType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pci</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>scsi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='capsType'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='pciBackend'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </hostdev>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <rng supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-non-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>random</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>egd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>builtin</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <filesystem supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='driverType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>path</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>handle</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtiofs</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </filesystem>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <tpm supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tpm-tis</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tpm-crb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>emulator</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>external</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendVersion'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>2.0</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </tpm>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <redirdev supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='bus'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </redirdev>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <channel supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pty</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>unix</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </channel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <crypto supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>qemu</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>builtin</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </crypto>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <interface supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>default</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>passt</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <panic supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>isa</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>hyperv</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </panic>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <console supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>null</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pty</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dev</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>file</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pipe</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>stdio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>udp</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tcp</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>unix</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>qemu-vdagent</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dbus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </console>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <gic supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <vmcoreinfo supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <genid supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <backingStoreInput supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <backup supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <async-teardown supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <ps2 supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <sev supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <sgx supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <hyperv supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='features'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>relaxed</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vapic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>spinlocks</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vpindex</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>runtime</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>synic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>stimer</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>reset</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vendor_id</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>frequencies</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>reenlightenment</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tlbflush</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>ipi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>avic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>emsr_bitmap</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>xmm_input</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <defaults>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <spinlocks>4095</spinlocks>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <stimer_direct>on</stimer_direct>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </defaults>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </hyperv>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <launchSecurity supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='sectype'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tdx</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </launchSecurity>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: </domainCapabilities>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.227 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Nov 28 11:11:08 np0005538960 nova_compute[187252]: <domainCapabilities>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <path>/usr/libexec/qemu-kvm</path>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <domain>kvm</domain>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <machine>pc-q35-rhel9.8.0</machine>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <arch>x86_64</arch>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <vcpu max='4096'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <iothreads supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <os supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <enum name='firmware'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>efi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <loader supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>rom</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pflash</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='readonly'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>yes</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>no</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='secure'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>yes</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>no</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </loader>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='host-passthrough' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='hostPassthroughMigratable'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>on</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>off</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='maximum' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='maximumMigratable'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>on</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>off</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='host-model' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model fallback='forbid'>EPYC-Rome</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <vendor>AMD</vendor>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <maxphysaddr mode='passthrough' limit='40'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='x2apic'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc-deadline'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='hypervisor'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc_adjust'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='spec-ctrl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='stibp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='cmp_legacy'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='overflow-recov'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='succor'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='amd-ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='virt-ssbd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='lbrv'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='tsc-scale'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='vmcb-clean'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='flushbyasid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='pause-filter'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='pfthreshold'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='svme-addr-chk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='require' name='lfence-always-serializing'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <feature policy='disable' name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <mode name='custom' supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Broadwell-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cascadelake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Cooperlake-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Denverton-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Dhyana-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Genoa'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='auto-ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Genoa-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='auto-ibrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Milan-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amd-psfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='no-nested-data-bp'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='null-sel-clr-base'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='stibp-always-on'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-Rome-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='EPYC-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='GraniteRapids-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-128'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-256'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx10-512'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='prefetchiti'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Haswell-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-noTSX'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v6'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Icelake-Server-v7'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='IvyBridge-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='KnightsMill'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512er'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512pf'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='KnightsMill-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4fmaps'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-4vnniw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512er'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512pf'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G4-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tbm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Opteron_G5-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fma4'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tbm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xop'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SapphireRapids-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='amx-tile'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-bf16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-fp16'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512-vpopcntdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bitalg'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vbmi2'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrc'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fzrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='la57'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='taa-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='tsx-ldtrk'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xfd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SierraForest'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cmpccxadd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='SierraForest-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ifma'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-ne-convert'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx-vnni-int8'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='bus-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cmpccxadd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fbsdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='fsrs'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ibrs-all'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mcdt-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pbrsb-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='psdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='sbdr-ssdp-no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='serialize'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vaes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='vpclmulqdq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Client-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='hle'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='rtm'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Skylake-Server-v5'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512bw'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512cd'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512dq'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512f'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='avx512vl'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='invpcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pcid'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='pku'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='mpx'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v2'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v3'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='core-capability'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='split-lock-detect'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='Snowridge-v4'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='cldemote'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='erms'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='gfni'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdir64b'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='movdiri'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='xsaves'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='athlon'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='athlon-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='core2duo'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='core2duo-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='coreduo'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='coreduo-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='n270'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='n270-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='ss'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='phenom'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <blockers model='phenom-v1'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnow'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <feature name='3dnowext'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </blockers>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </mode>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <memoryBacking supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <enum name='sourceType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>file</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>anonymous</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <value>memfd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </memoryBacking>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <disk supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='diskDevice'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>disk</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>cdrom</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>floppy</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>lun</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='bus'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>fdc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>scsi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>sata</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-non-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <graphics supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vnc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>egl-headless</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dbus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </graphics>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <video supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='modelType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vga</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>cirrus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>none</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>bochs</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>ramfb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <hostdev supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='mode'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>subsystem</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='startupPolicy'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>default</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>mandatory</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>requisite</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>optional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='subsysType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pci</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>scsi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='capsType'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='pciBackend'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </hostdev>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <rng supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtio-non-transitional</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>random</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>egd</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>builtin</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <filesystem supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='driverType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>path</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>handle</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>virtiofs</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </filesystem>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <tpm supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tpm-tis</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tpm-crb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>emulator</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>external</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendVersion'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>2.0</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </tpm>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <redirdev supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='bus'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>usb</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </redirdev>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <channel supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pty</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>unix</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </channel>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <crypto supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>qemu</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendModel'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>builtin</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </crypto>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <interface supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='backendType'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>default</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>passt</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <panic supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='model'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>isa</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>hyperv</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </panic>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <console supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='type'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>null</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vc</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pty</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dev</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>file</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>pipe</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>stdio</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>udp</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tcp</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>unix</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>qemu-vdagent</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>dbus</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </console>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <gic supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <vmcoreinfo supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <genid supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <backingStoreInput supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <backup supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <async-teardown supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <ps2 supported='yes'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <sev supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <sgx supported='no'/>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <hyperv supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='features'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>relaxed</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vapic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>spinlocks</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vpindex</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>runtime</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>synic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>stimer</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>reset</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>vendor_id</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>frequencies</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>reenlightenment</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tlbflush</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>ipi</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>avic</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>emsr_bitmap</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>xmm_input</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <defaults>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <spinlocks>4095</spinlocks>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <stimer_direct>on</stimer_direct>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <tlbflush_direct>on</tlbflush_direct>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <tlbflush_extended>on</tlbflush_extended>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <vendor_id>Linux KVM Hv</vendor_id>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </defaults>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </hyperv>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    <launchSecurity supported='yes'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      <enum name='sectype'>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:        <value>tdx</value>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:      </enum>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:    </launchSecurity>
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: </domainCapabilities>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.296 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.297 187256 INFO nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Secure Boot support detected#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.300 187256 INFO nova.virt.libvirt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.313 187256 DEBUG nova.virt.libvirt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] cpu compare xml: <cpu match="exact">
Nov 28 11:11:08 np0005538960 nova_compute[187252]:  <model>Nehalem</model>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: </cpu>
Nov 28 11:11:08 np0005538960 nova_compute[187252]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.317 187256 DEBUG nova.virt.libvirt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.355 187256 INFO nova.virt.node [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Determined node identity 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from /var/lib/nova/compute_id#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.392 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Verified node 65f0ce30-d9ca-4c16-b536-acd92f5f41ce matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.441 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.583 187256 DEBUG oslo_concurrency.lockutils [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.583 187256 DEBUG oslo_concurrency.lockutils [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.583 187256 DEBUG oslo_concurrency.lockutils [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.584 187256 DEBUG nova.compute.resource_tracker [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.723 187256 WARNING nova.virt.libvirt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.724 187256 DEBUG nova.compute.resource_tracker [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6193MB free_disk=73.54620742797852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.725 187256 DEBUG oslo_concurrency.lockutils [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:11:08 np0005538960 nova_compute[187252]: 2025-11-28 16:11:08.725 187256 DEBUG oslo_concurrency.lockutils [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.022 187256 DEBUG nova.compute.resource_tracker [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.023 187256 DEBUG nova.compute.resource_tracker [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.047 187256 DEBUG nova.scheduler.client.report [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Refreshing inventories for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.157 187256 DEBUG nova.scheduler.client.report [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Updating ProviderTree inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.158 187256 DEBUG nova.compute.provider_tree [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.241 187256 DEBUG nova.scheduler.client.report [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Refreshing aggregate associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.264 187256 DEBUG nova.scheduler.client.report [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Refreshing trait associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, traits: COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.295 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Nov 28 11:11:09 np0005538960 nova_compute[187252]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.295 187256 INFO nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] kernel doesn't support AMD SEV#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.295 187256 DEBUG nova.compute.provider_tree [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.296 187256 DEBUG nova.virt.libvirt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.297 187256 DEBUG nova.virt.libvirt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Libvirt baseline CPU <cpu>
Nov 28 11:11:09 np0005538960 nova_compute[187252]:  <arch>x86_64</arch>
Nov 28 11:11:09 np0005538960 nova_compute[187252]:  <model>Nehalem</model>
Nov 28 11:11:09 np0005538960 nova_compute[187252]:  <vendor>AMD</vendor>
Nov 28 11:11:09 np0005538960 nova_compute[187252]:  <topology sockets="8" cores="1" threads="1"/>
Nov 28 11:11:09 np0005538960 nova_compute[187252]: </cpu>
Nov 28 11:11:09 np0005538960 nova_compute[187252]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.314 187256 DEBUG nova.scheduler.client.report [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.345 187256 DEBUG nova.compute.resource_tracker [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.345 187256 DEBUG oslo_concurrency.lockutils [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.346 187256 DEBUG nova.service [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.398 187256 DEBUG nova.service [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Nov 28 11:11:09 np0005538960 nova_compute[187252]: 2025-11-28 16:11:09.399 187256 DEBUG nova.servicegroup.drivers.db [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Nov 28 11:11:12 np0005538960 systemd-logind[788]: New session 27 of user zuul.
Nov 28 11:11:12 np0005538960 systemd[1]: Started Session 27 of User zuul.
Nov 28 11:11:13 np0005538960 podman[187606]: 2025-11-28 16:11:13.163405825 +0000 UTC m=+0.064624637 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 11:11:13 np0005538960 python3.9[187751]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Nov 28 11:11:15 np0005538960 podman[187837]: 2025-11-28 16:11:15.198318069 +0000 UTC m=+0.102078284 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 11:11:15 np0005538960 python3.9[187927]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:11:15 np0005538960 systemd[1]: Reloading.
Nov 28 11:11:15 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:11:15 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:11:17 np0005538960 python3.9[188113]: ansible-ansible.builtin.service_facts Invoked
Nov 28 11:11:17 np0005538960 network[188130]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Nov 28 11:11:17 np0005538960 network[188131]: 'network-scripts' will be removed from distribution in near future.
Nov 28 11:11:17 np0005538960 network[188132]: It is advised to switch to 'NetworkManager' instead for network management.
Nov 28 11:11:23 np0005538960 python3.9[188406]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:11:25 np0005538960 python3.9[188559]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:11:25 np0005538960 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 11:11:25 np0005538960 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 11:11:26 np0005538960 python3.9[188714]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:11:27 np0005538960 python3.9[188866]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:11:28 np0005538960 python3.9[189018]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 11:11:29 np0005538960 python3.9[189170]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:11:29 np0005538960 systemd[1]: Reloading.
Nov 28 11:11:29 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:11:29 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:11:30 np0005538960 python3.9[189358]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:11:31 np0005538960 python3.9[189511]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:11:32 np0005538960 python3.9[189661]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:11:33 np0005538960 python3.9[189813]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:11:33 np0005538960 python3.9[189934]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346292.6081147-360-55870088535245/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=a9bdb897f3979025d9a372b4beff53a09cbe0d55 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:11:34 np0005538960 python3.9[190086]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Nov 28 11:11:36 np0005538960 python3.9[190238]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Nov 28 11:11:39 np0005538960 podman[190264]: 2025-11-28 16:11:39.278288907 +0000 UTC m=+0.166650700 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 11:11:43 np0005538960 python3.9[190418]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Nov 28 11:11:43 np0005538960 podman[190419]: 2025-11-28 16:11:43.502678108 +0000 UTC m=+0.061382585 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 11:11:45 np0005538960 podman[190568]: 2025-11-28 16:11:45.330279333 +0000 UTC m=+0.068505396 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 28 11:11:45 np0005538960 python3.9[190613]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Nov 28 11:11:49 np0005538960 python3.9[190771]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:11:49 np0005538960 python3.9[190892]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764346308.6037815-564-214836257030408/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:11:50 np0005538960 python3.9[191042]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:11:50 np0005538960 python3.9[191163]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764346309.9818132-564-87392165487010/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:11:51 np0005538960 python3.9[191313]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:11:52 np0005538960 python3.9[191434]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1764346311.1910741-564-4928341149415/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:11:53 np0005538960 python3.9[191584]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:11:54 np0005538960 python3.9[191736]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:11:54 np0005538960 python3.9[191888]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:11:55 np0005538960 python3.9[192009]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346314.4562593-741-46974336119682/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:11:56 np0005538960 python3.9[192159]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:11:56 np0005538960 python3.9[192235]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:11:57 np0005538960 python3.9[192385]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:11:58 np0005538960 python3.9[192506]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346316.9734683-741-116456583394774/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:11:59 np0005538960 python3.9[192656]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:11:59 np0005538960 python3.9[192777]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346318.5165045-741-197788956278625/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:00 np0005538960 nova_compute[187252]: 2025-11-28 16:12:00.402 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:00 np0005538960 nova_compute[187252]: 2025-11-28 16:12:00.419 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:00 np0005538960 python3.9[192927]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:01 np0005538960 python3.9[193048]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346319.970797-741-102790281882259/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:01 np0005538960 python3.9[193198]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:02 np0005538960 python3.9[193319]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346321.3689384-741-63949678109381/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:03 np0005538960 python3.9[193469]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:03 np0005538960 python3.9[193590]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346322.7308457-741-49566896136131/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:04 np0005538960 python3.9[193740]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:05 np0005538960 python3.9[193861]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346324.0372024-741-181194667576964/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:05 np0005538960 python3.9[194011]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:12:06.330 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:12:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:12:06.331 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:12:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:12:06.331 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:12:06 np0005538960 python3.9[194132]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346325.4080858-741-260666216681086/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:07 np0005538960 python3.9[194282]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.318 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.319 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.319 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.319 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.330 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.330 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.330 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.331 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.331 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.331 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.332 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.332 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.332 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.350 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.350 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.351 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.351 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.529 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.529 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6142MB free_disk=73.5459098815918GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.530 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.530 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.596 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.597 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.626 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.647 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.650 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:12:07 np0005538960 nova_compute[187252]: 2025-11-28 16:12:07.650 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:12:07 np0005538960 python3.9[194403]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346326.7314644-741-221568535556357/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:08 np0005538960 python3.9[194553]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:09 np0005538960 python3.9[194674]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346328.015884-741-93252121224740/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:10 np0005538960 podman[194699]: 2025-11-28 16:12:10.253612272 +0000 UTC m=+0.153297729 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 28 11:12:10 np0005538960 python3.9[194850]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:11 np0005538960 python3.9[194926]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:12 np0005538960 python3.9[195076]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:12 np0005538960 python3.9[195152]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:13 np0005538960 python3.9[195302]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:13 np0005538960 podman[195352]: 2025-11-28 16:12:13.839271523 +0000 UTC m=+0.083642656 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:12:13 np0005538960 python3.9[195397]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:14 np0005538960 python3.9[195549]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:15 np0005538960 podman[195701]: 2025-11-28 16:12:15.501361127 +0000 UTC m=+0.090914093 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd)
Nov 28 11:12:15 np0005538960 python3.9[195702]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:16 np0005538960 python3.9[195872]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:12:17 np0005538960 python3.9[196024]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:12:17 np0005538960 systemd[1]: Reloading.
Nov 28 11:12:17 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:12:17 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:12:17 np0005538960 systemd[1]: Listening on Podman API Socket.
Nov 28 11:12:19 np0005538960 python3.9[196215]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:19 np0005538960 python3.9[196339]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346338.6027102-1407-165754391422219/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:12:20 np0005538960 python3.9[196415]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:20 np0005538960 python3.9[196538]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346338.6027102-1407-165754391422219/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:12:22 np0005538960 python3.9[196690]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Nov 28 11:12:23 np0005538960 python3.9[196842]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 11:12:24 np0005538960 python3[196994]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 11:12:25 np0005538960 podman[197032]: 2025-11-28 16:12:25.27301546 +0000 UTC m=+0.061245449 container create 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 11:12:25 np0005538960 podman[197032]: 2025-11-28 16:12:25.237558416 +0000 UTC m=+0.025788395 image pull e6f07353639e492d8c9627d6d615ceeb47cb00ac4d14993b12e8023ee2aeee6f quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Nov 28 11:12:25 np0005538960 python3[196994]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Nov 28 11:12:26 np0005538960 python3.9[197220]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:12:27 np0005538960 python3.9[197374]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:28 np0005538960 python3.9[197527]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764346347.6632485-1599-135609566207241/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:29 np0005538960 python3.9[197603]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:12:29 np0005538960 systemd[1]: Reloading.
Nov 28 11:12:29 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:12:29 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:12:30 np0005538960 python3.9[197714]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:12:30 np0005538960 systemd[1]: Reloading.
Nov 28 11:12:30 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:12:30 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:12:30 np0005538960 systemd[1]: Starting ceilometer_agent_compute container...
Nov 28 11:12:30 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:12:30 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a8a344876fd6fefe2dd358cd5a4b1d66aefe1c3766f4b21482479f62225ab0/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:30 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a8a344876fd6fefe2dd358cd5a4b1d66aefe1c3766f4b21482479f62225ab0/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:30 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a8a344876fd6fefe2dd358cd5a4b1d66aefe1c3766f4b21482479f62225ab0/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:30 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a8a344876fd6fefe2dd358cd5a4b1d66aefe1c3766f4b21482479f62225ab0/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:30 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683.
Nov 28 11:12:30 np0005538960 podman[197753]: 2025-11-28 16:12:30.854254505 +0000 UTC m=+0.136066915 container init 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: + sudo -E kolla_set_configs
Nov 28 11:12:30 np0005538960 podman[197753]: 2025-11-28 16:12:30.87773334 +0000 UTC m=+0.159545740 container start 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: sudo: unable to send audit message: Operation not permitted
Nov 28 11:12:30 np0005538960 podman[197753]: ceilometer_agent_compute
Nov 28 11:12:30 np0005538960 systemd[1]: Started ceilometer_agent_compute container.
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Validating config file
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Copying service configuration files
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: INFO:__main__:Writing out command to execute
Nov 28 11:12:30 np0005538960 podman[197776]: 2025-11-28 16:12:30.948297387 +0000 UTC m=+0.056552097 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 28 11:12:30 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-63edebd9fb7fb5d1.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 11:12:30 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-63edebd9fb7fb5d1.service: Failed with result 'exit-code'.
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: ++ cat /run_command
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: + ARGS=
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: + sudo kolla_copy_cacerts
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: sudo: unable to send audit message: Operation not permitted
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: + [[ ! -n '' ]]
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: + . kolla_extend_start
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: + umask 0022
Nov 28 11:12:30 np0005538960 ceilometer_agent_compute[197769]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.857 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.858 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.858 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.858 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.858 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.858 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.858 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.859 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.859 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.859 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.859 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.859 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.859 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.859 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.860 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.860 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.860 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.860 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.860 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.860 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.860 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.860 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.861 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.861 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.861 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.861 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.861 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.861 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.861 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.861 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.862 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.862 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.862 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.862 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.862 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.862 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.862 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.862 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.862 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.863 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.863 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.863 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.863 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.863 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.863 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.863 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.863 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.864 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.864 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.864 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.864 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.864 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.864 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.864 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.864 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.865 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.865 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.865 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.865 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.865 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.865 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.865 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.865 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.866 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.866 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.866 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.866 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.866 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.866 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.866 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.867 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.867 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.867 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.867 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.867 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.867 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.867 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.868 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.869 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.870 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.870 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.870 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.870 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.870 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.871 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.871 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.871 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.871 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.871 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.871 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.871 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.871 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.872 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.872 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.872 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.872 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.872 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.872 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.872 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.873 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.873 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.873 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.874 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.874 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.874 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.874 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.874 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.874 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.875 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.875 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.876 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.876 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.876 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.876 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.876 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.876 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.876 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.877 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.877 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.877 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.877 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.877 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.877 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.878 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.878 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.878 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.878 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.878 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.878 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.878 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.878 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.879 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.879 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.879 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.879 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.879 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.899 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.901 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 28 11:12:31 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.902 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:31.999 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.074 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.075 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.075 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.075 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.075 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.075 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.075 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.075 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.076 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.076 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.076 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.076 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.076 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.076 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.076 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.076 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.077 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.078 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.079 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.080 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.081 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.082 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.083 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.084 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.085 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.085 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.085 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.085 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.085 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.085 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.085 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.085 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.085 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.085 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.086 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.086 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.086 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.086 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.086 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.087 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.088 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.089 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.090 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.091 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.092 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.093 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.094 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.095 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.095 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.095 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.095 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.095 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.095 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.095 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.098 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.106 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.111 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.112 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.113 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.114 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.115 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.116 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:32 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:32.117 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:33 np0005538960 python3.9[197957]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:12:33 np0005538960 systemd[1]: Stopping ceilometer_agent_compute container...
Nov 28 11:12:33 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:33.383 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Nov 28 11:12:33 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:33.485 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Nov 28 11:12:33 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:33.486 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Nov 28 11:12:33 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:33.486 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Nov 28 11:12:33 np0005538960 virtqemud[186797]: End of file while reading data: Input/output error
Nov 28 11:12:33 np0005538960 virtqemud[186797]: End of file while reading data: Input/output error
Nov 28 11:12:33 np0005538960 ceilometer_agent_compute[197769]: 2025-11-28 16:12:33.502 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Nov 28 11:12:33 np0005538960 systemd[1]: libpod-1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683.scope: Deactivated successfully.
Nov 28 11:12:33 np0005538960 systemd[1]: libpod-1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683.scope: Consumed 1.472s CPU time.
Nov 28 11:12:33 np0005538960 podman[197961]: 2025-11-28 16:12:33.712279249 +0000 UTC m=+0.390181190 container died 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:12:33 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-63edebd9fb7fb5d1.timer: Deactivated successfully.
Nov 28 11:12:33 np0005538960 systemd[1]: Stopped /usr/bin/podman healthcheck run 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683.
Nov 28 11:12:33 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-userdata-shm.mount: Deactivated successfully.
Nov 28 11:12:33 np0005538960 systemd[1]: var-lib-containers-storage-overlay-f1a8a344876fd6fefe2dd358cd5a4b1d66aefe1c3766f4b21482479f62225ab0-merged.mount: Deactivated successfully.
Nov 28 11:12:33 np0005538960 podman[197961]: 2025-11-28 16:12:33.778925715 +0000 UTC m=+0.456827656 container cleanup 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 11:12:33 np0005538960 podman[197961]: ceilometer_agent_compute
Nov 28 11:12:33 np0005538960 podman[197991]: ceilometer_agent_compute
Nov 28 11:12:33 np0005538960 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Nov 28 11:12:33 np0005538960 systemd[1]: Stopped ceilometer_agent_compute container.
Nov 28 11:12:33 np0005538960 systemd[1]: Starting ceilometer_agent_compute container...
Nov 28 11:12:34 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:12:34 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a8a344876fd6fefe2dd358cd5a4b1d66aefe1c3766f4b21482479f62225ab0/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:34 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a8a344876fd6fefe2dd358cd5a4b1d66aefe1c3766f4b21482479f62225ab0/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:34 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a8a344876fd6fefe2dd358cd5a4b1d66aefe1c3766f4b21482479f62225ab0/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:34 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1a8a344876fd6fefe2dd358cd5a4b1d66aefe1c3766f4b21482479f62225ab0/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:34 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683.
Nov 28 11:12:34 np0005538960 podman[198004]: 2025-11-28 16:12:34.071544391 +0000 UTC m=+0.146903854 container init 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: + sudo -E kolla_set_configs
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: sudo: unable to send audit message: Operation not permitted
Nov 28 11:12:34 np0005538960 podman[198004]: 2025-11-28 16:12:34.10182209 +0000 UTC m=+0.177181523 container start 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 28 11:12:34 np0005538960 podman[198004]: ceilometer_agent_compute
Nov 28 11:12:34 np0005538960 systemd[1]: Started ceilometer_agent_compute container.
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Validating config file
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Copying service configuration files
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: INFO:__main__:Writing out command to execute
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: ++ cat /run_command
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: + ARGS=
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: + sudo kolla_copy_cacerts
Nov 28 11:12:34 np0005538960 podman[198026]: 2025-11-28 16:12:34.191687716 +0000 UTC m=+0.077499328 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 11:12:34 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-464ee5ad3ed80444.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 11:12:34 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-464ee5ad3ed80444.service: Failed with result 'exit-code'.
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: sudo: unable to send audit message: Operation not permitted
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: + [[ ! -n '' ]]
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: + . kolla_extend_start
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: + umask 0022
Nov 28 11:12:34 np0005538960 ceilometer_agent_compute[198019]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.088 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.088 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.088 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.088 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.089 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.089 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.089 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.089 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.089 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.089 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.089 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.089 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.089 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.090 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.091 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.092 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.093 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.093 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.093 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.093 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.093 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.093 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.093 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.093 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.093 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.094 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.095 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.096 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.097 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.098 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.098 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.098 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.098 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.098 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.098 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.098 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.098 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.099 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.099 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.099 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.099 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.099 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.099 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.099 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.099 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.099 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.099 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.100 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.101 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.102 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.103 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.123 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.124 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.125 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.136 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.266 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.266 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.266 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.266 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.267 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.267 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.267 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.267 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.267 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.267 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.267 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.267 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.267 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.267 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.268 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.268 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.268 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.268 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.268 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.268 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.268 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.268 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.269 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.270 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.271 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.272 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.273 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.274 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.275 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.275 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.275 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.275 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.275 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.275 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.275 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.275 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.275 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.275 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.276 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.276 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.276 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.276 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.276 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.276 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.276 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.276 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.276 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.276 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.277 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.277 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.277 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.277 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.277 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.277 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.277 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.277 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.277 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.277 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.278 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.278 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.278 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.278 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.278 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.278 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.278 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.278 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.278 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.278 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.279 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.279 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.279 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.279 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.279 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.279 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.279 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.279 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.280 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.281 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.282 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.283 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.284 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.285 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.285 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.285 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.285 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.285 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.285 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.285 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.285 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.285 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.285 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.286 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.286 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.286 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.286 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.286 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.286 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.286 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.286 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.286 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.286 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.287 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.287 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.287 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.287 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.287 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.287 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.292 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.300 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:12:35.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:12:36 np0005538960 python3.9[198207]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:37 np0005538960 python3.9[198332]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346355.3622084-1695-75549024819028/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:12:39 np0005538960 python3.9[198484]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Nov 28 11:12:40 np0005538960 python3.9[198636]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 11:12:40 np0005538960 podman[198760]: 2025-11-28 16:12:40.882716012 +0000 UTC m=+0.083521972 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 11:12:41 np0005538960 python3[198807]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 11:12:41 np0005538960 podman[198852]: 2025-11-28 16:12:41.334421265 +0000 UTC m=+0.057631515 container create 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible)
Nov 28 11:12:41 np0005538960 podman[198852]: 2025-11-28 16:12:41.302425881 +0000 UTC m=+0.025636211 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Nov 28 11:12:41 np0005538960 python3[198807]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Nov 28 11:12:42 np0005538960 python3.9[199042]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:12:43 np0005538960 python3.9[199196]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:44 np0005538960 podman[199267]: 2025-11-28 16:12:44.166817439 +0000 UTC m=+0.058740763 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:12:44 np0005538960 python3.9[199367]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764346363.8667264-1854-262810838584752/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:12:45 np0005538960 python3.9[199443]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:12:45 np0005538960 systemd[1]: Reloading.
Nov 28 11:12:45 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:12:45 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:12:45 np0005538960 auditd[705]: Audit daemon rotating log files
Nov 28 11:12:45 np0005538960 podman[199526]: 2025-11-28 16:12:45.805992884 +0000 UTC m=+0.079067298 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, managed_by=edpm_ansible)
Nov 28 11:12:46 np0005538960 python3.9[199574]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:12:46 np0005538960 systemd[1]: Reloading.
Nov 28 11:12:46 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:12:46 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:12:46 np0005538960 systemd[1]: Starting node_exporter container...
Nov 28 11:12:46 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:12:46 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a03713a8780b754827871575ad6f01a691ebb5fa169554943e9c432e968fd3a3/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:46 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a03713a8780b754827871575ad6f01a691ebb5fa169554943e9c432e968fd3a3/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:46 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8.
Nov 28 11:12:46 np0005538960 podman[199614]: 2025-11-28 16:12:46.594278904 +0000 UTC m=+0.130472681 container init 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.610Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.610Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.610Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.611Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.611Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.612Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.612Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.612Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.612Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.612Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.612Z caller=node_exporter.go:117 level=info collector=arp
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=bcache
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=bonding
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=cpu
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=edac
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=filefd
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=netclass
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=netdev
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=netstat
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=nfs
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=nvme
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=softnet
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=systemd
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=xfs
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.613Z caller=node_exporter.go:117 level=info collector=zfs
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.614Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 28 11:12:46 np0005538960 node_exporter[199630]: ts=2025-11-28T16:12:46.614Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 28 11:12:46 np0005538960 podman[199614]: 2025-11-28 16:12:46.621958407 +0000 UTC m=+0.158152184 container start 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:12:46 np0005538960 podman[199614]: node_exporter
Nov 28 11:12:46 np0005538960 systemd[1]: Started node_exporter container.
Nov 28 11:12:46 np0005538960 podman[199639]: 2025-11-28 16:12:46.73313073 +0000 UTC m=+0.099130184 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:12:49 np0005538960 python3.9[199814]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:12:49 np0005538960 systemd[1]: Stopping node_exporter container...
Nov 28 11:12:49 np0005538960 systemd[1]: libpod-9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8.scope: Deactivated successfully.
Nov 28 11:12:49 np0005538960 podman[199818]: 2025-11-28 16:12:49.695780169 +0000 UTC m=+0.078365380 container died 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:12:49 np0005538960 systemd[1]: 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8-55a8cdc745d5af4d.timer: Deactivated successfully.
Nov 28 11:12:49 np0005538960 systemd[1]: Stopped /usr/bin/podman healthcheck run 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8.
Nov 28 11:12:49 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8-userdata-shm.mount: Deactivated successfully.
Nov 28 11:12:49 np0005538960 systemd[1]: var-lib-containers-storage-overlay-a03713a8780b754827871575ad6f01a691ebb5fa169554943e9c432e968fd3a3-merged.mount: Deactivated successfully.
Nov 28 11:12:49 np0005538960 podman[199818]: 2025-11-28 16:12:49.742756198 +0000 UTC m=+0.125341419 container cleanup 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:12:49 np0005538960 podman[199818]: node_exporter
Nov 28 11:12:49 np0005538960 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 11:12:49 np0005538960 podman[199844]: node_exporter
Nov 28 11:12:49 np0005538960 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Nov 28 11:12:49 np0005538960 systemd[1]: Stopped node_exporter container.
Nov 28 11:12:49 np0005538960 systemd[1]: Starting node_exporter container...
Nov 28 11:12:49 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:12:49 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a03713a8780b754827871575ad6f01a691ebb5fa169554943e9c432e968fd3a3/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:49 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a03713a8780b754827871575ad6f01a691ebb5fa169554943e9c432e968fd3a3/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 11:12:50 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8.
Nov 28 11:12:50 np0005538960 podman[199858]: 2025-11-28 16:12:50.021150418 +0000 UTC m=+0.149898442 container init 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.041Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.041Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.041Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.042Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.042Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=arp
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=bcache
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=bonding
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=btrfs
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=conntrack
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=cpu
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=cpufreq
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=diskstats
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=edac
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=fibrechannel
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=filefd
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=filesystem
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=infiniband
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=ipvs
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=loadavg
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=mdadm
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.043Z caller=node_exporter.go:117 level=info collector=meminfo
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=netclass
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=netdev
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=netstat
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=nfs
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=nfsd
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=nvme
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=schedstat
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=sockstat
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=softnet
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=systemd
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=tapestats
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=udp_queues
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=vmstat
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=xfs
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=node_exporter.go:117 level=info collector=zfs
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.044Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Nov 28 11:12:50 np0005538960 node_exporter[199874]: ts=2025-11-28T16:12:50.045Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Nov 28 11:12:50 np0005538960 podman[199858]: 2025-11-28 16:12:50.058952431 +0000 UTC m=+0.187700435 container start 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:12:50 np0005538960 podman[199858]: node_exporter
Nov 28 11:12:50 np0005538960 systemd[1]: Started node_exporter container.
Nov 28 11:12:50 np0005538960 podman[199883]: 2025-11-28 16:12:50.14200918 +0000 UTC m=+0.069123491 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:12:51 np0005538960 python3.9[200061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:12:52 np0005538960 python3.9[200185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346370.7246647-1950-4494036462617/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:12:53 np0005538960 python3.9[200337]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Nov 28 11:12:54 np0005538960 python3.9[200489]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 11:12:55 np0005538960 python3[200641]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 11:12:57 np0005538960 podman[200656]: 2025-11-28 16:12:57.385183606 +0000 UTC m=+1.577324722 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 28 11:12:57 np0005538960 podman[200753]: 2025-11-28 16:12:57.543553165 +0000 UTC m=+0.054978268 container create 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:12:57 np0005538960 podman[200753]: 2025-11-28 16:12:57.51385222 +0000 UTC m=+0.025277303 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Nov 28 11:12:57 np0005538960 python3[200641]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Nov 28 11:12:59 np0005538960 python3.9[200942]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:13:00 np0005538960 python3.9[201096]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:13:01 np0005538960 python3.9[201247]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764346380.386199-2109-26506304954595/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:13:01 np0005538960 python3.9[201323]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:13:01 np0005538960 systemd[1]: Reloading.
Nov 28 11:13:02 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:13:02 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:13:03 np0005538960 python3.9[201435]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:13:03 np0005538960 systemd[1]: Reloading.
Nov 28 11:13:03 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:13:03 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:13:03 np0005538960 systemd[1]: Starting podman_exporter container...
Nov 28 11:13:03 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:13:03 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b0280236c998b26f0065453300db156670fd215afae180b18b1fdaf479cd844/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 11:13:03 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b0280236c998b26f0065453300db156670fd215afae180b18b1fdaf479cd844/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 11:13:03 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee.
Nov 28 11:13:03 np0005538960 podman[201478]: 2025-11-28 16:13:03.790187246 +0000 UTC m=+0.160661909 container init 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:13:03 np0005538960 podman_exporter[201493]: ts=2025-11-28T16:13:03.808Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 28 11:13:03 np0005538960 podman_exporter[201493]: ts=2025-11-28T16:13:03.808Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 28 11:13:03 np0005538960 podman_exporter[201493]: ts=2025-11-28T16:13:03.808Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 28 11:13:03 np0005538960 podman_exporter[201493]: ts=2025-11-28T16:13:03.808Z caller=handler.go:105 level=info collector=container
Nov 28 11:13:03 np0005538960 podman[201478]: 2025-11-28 16:13:03.81944501 +0000 UTC m=+0.189919673 container start 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:13:03 np0005538960 podman[201478]: podman_exporter
Nov 28 11:13:03 np0005538960 systemd[1]: Starting Podman API Service...
Nov 28 11:13:03 np0005538960 systemd[1]: Started Podman API Service.
Nov 28 11:13:03 np0005538960 systemd[1]: Started podman_exporter container.
Nov 28 11:13:03 np0005538960 podman[201504]: time="2025-11-28T16:13:03Z" level=info msg="/usr/bin/podman filtering at log level info"
Nov 28 11:13:03 np0005538960 podman[201504]: time="2025-11-28T16:13:03Z" level=info msg="Setting parallel job count to 25"
Nov 28 11:13:03 np0005538960 podman[201504]: time="2025-11-28T16:13:03Z" level=info msg="Using sqlite as database backend"
Nov 28 11:13:03 np0005538960 podman[201504]: time="2025-11-28T16:13:03Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Nov 28 11:13:03 np0005538960 podman[201504]: time="2025-11-28T16:13:03Z" level=info msg="Using systemd socket activation to determine API endpoint"
Nov 28 11:13:03 np0005538960 podman[201504]: time="2025-11-28T16:13:03Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Nov 28 11:13:03 np0005538960 podman[201504]: @ - - [28/Nov/2025:16:13:03 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 28 11:13:03 np0005538960 podman[201504]: time="2025-11-28T16:13:03Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 11:13:03 np0005538960 podman[201502]: 2025-11-28 16:13:03.892593943 +0000 UTC m=+0.059933923 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:13:03 np0005538960 systemd[1]: 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee-2d1e4b22e4a9034.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 11:13:03 np0005538960 systemd[1]: 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee-2d1e4b22e4a9034.service: Failed with result 'exit-code'.
Nov 28 11:13:03 np0005538960 podman[201504]: @ - - [28/Nov/2025:16:13:03 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19569 "" "Go-http-client/1.1"
Nov 28 11:13:03 np0005538960 podman_exporter[201493]: ts=2025-11-28T16:13:03.907Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 28 11:13:03 np0005538960 podman_exporter[201493]: ts=2025-11-28T16:13:03.908Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 28 11:13:03 np0005538960 podman_exporter[201493]: ts=2025-11-28T16:13:03.909Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 28 11:13:04 np0005538960 podman[201665]: 2025-11-28 16:13:04.904970285 +0000 UTC m=+0.060611151 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:13:04 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-464ee5ad3ed80444.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 11:13:04 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-464ee5ad3ed80444.service: Failed with result 'exit-code'.
Nov 28 11:13:05 np0005538960 python3.9[201711]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:13:05 np0005538960 systemd[1]: Stopping podman_exporter container...
Nov 28 11:13:05 np0005538960 podman[201504]: @ - - [28/Nov/2025:16:13:03 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 3760 "" "Go-http-client/1.1"
Nov 28 11:13:05 np0005538960 systemd[1]: libpod-7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee.scope: Deactivated successfully.
Nov 28 11:13:05 np0005538960 podman[201715]: 2025-11-28 16:13:05.329493879 +0000 UTC m=+0.060979482 container died 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 11:13:05 np0005538960 systemd[1]: 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee-2d1e4b22e4a9034.timer: Deactivated successfully.
Nov 28 11:13:05 np0005538960 systemd[1]: Stopped /usr/bin/podman healthcheck run 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee.
Nov 28 11:13:05 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee-userdata-shm.mount: Deactivated successfully.
Nov 28 11:13:05 np0005538960 systemd[1]: var-lib-containers-storage-overlay-4b0280236c998b26f0065453300db156670fd215afae180b18b1fdaf479cd844-merged.mount: Deactivated successfully.
Nov 28 11:13:05 np0005538960 podman[201715]: 2025-11-28 16:13:05.794678629 +0000 UTC m=+0.526164162 container cleanup 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:13:05 np0005538960 podman[201715]: podman_exporter
Nov 28 11:13:05 np0005538960 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 11:13:05 np0005538960 podman[201743]: podman_exporter
Nov 28 11:13:05 np0005538960 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Nov 28 11:13:05 np0005538960 systemd[1]: Stopped podman_exporter container.
Nov 28 11:13:05 np0005538960 systemd[1]: Starting podman_exporter container...
Nov 28 11:13:06 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:13:06 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b0280236c998b26f0065453300db156670fd215afae180b18b1fdaf479cd844/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 11:13:06 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b0280236c998b26f0065453300db156670fd215afae180b18b1fdaf479cd844/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 11:13:06 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee.
Nov 28 11:13:06 np0005538960 podman[201756]: 2025-11-28 16:13:06.043459676 +0000 UTC m=+0.127604928 container init 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:13:06 np0005538960 podman_exporter[201771]: ts=2025-11-28T16:13:06.062Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Nov 28 11:13:06 np0005538960 podman_exporter[201771]: ts=2025-11-28T16:13:06.062Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Nov 28 11:13:06 np0005538960 podman_exporter[201771]: ts=2025-11-28T16:13:06.062Z caller=handler.go:94 level=info msg="enabled collectors"
Nov 28 11:13:06 np0005538960 podman_exporter[201771]: ts=2025-11-28T16:13:06.062Z caller=handler.go:105 level=info collector=container
Nov 28 11:13:06 np0005538960 podman[201504]: @ - - [28/Nov/2025:16:13:06 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Nov 28 11:13:06 np0005538960 podman[201504]: time="2025-11-28T16:13:06Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Nov 28 11:13:06 np0005538960 podman[201756]: 2025-11-28 16:13:06.075372183 +0000 UTC m=+0.159517425 container start 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:13:06 np0005538960 podman[201756]: podman_exporter
Nov 28 11:13:06 np0005538960 podman[201504]: @ - - [28/Nov/2025:16:13:06 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19571 "" "Go-http-client/1.1"
Nov 28 11:13:06 np0005538960 podman_exporter[201771]: ts=2025-11-28T16:13:06.083Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Nov 28 11:13:06 np0005538960 podman_exporter[201771]: ts=2025-11-28T16:13:06.084Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Nov 28 11:13:06 np0005538960 podman_exporter[201771]: ts=2025-11-28T16:13:06.084Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Nov 28 11:13:06 np0005538960 systemd[1]: Started podman_exporter container.
Nov 28 11:13:06 np0005538960 podman[201780]: 2025-11-28 16:13:06.151201859 +0000 UTC m=+0.058645648 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:13:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:13:06.331 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:13:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:13:06.331 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:13:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:13:06.332 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:13:06 np0005538960 python3.9[201956]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:13:07 np0005538960 nova_compute[187252]: 2025-11-28 16:13:07.642 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:13:07 np0005538960 python3.9[202079]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1764346386.4243748-2205-161450022955957/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Nov 28 11:13:07 np0005538960 nova_compute[187252]: 2025-11-28 16:13:07.666 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:13:07 np0005538960 nova_compute[187252]: 2025-11-28 16:13:07.667 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:13:08 np0005538960 nova_compute[187252]: 2025-11-28 16:13:08.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:13:08 np0005538960 nova_compute[187252]: 2025-11-28 16:13:08.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:13:08 np0005538960 nova_compute[187252]: 2025-11-28 16:13:08.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:13:08 np0005538960 nova_compute[187252]: 2025-11-28 16:13:08.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:13:08 np0005538960 python3.9[202231]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.329 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.329 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.330 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.330 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.367 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.368 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.368 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.368 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.554 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.555 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6052MB free_disk=73.50997161865234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.555 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.555 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.622 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.622 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.652 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.665 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.667 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:13:09 np0005538960 nova_compute[187252]: 2025-11-28 16:13:09.667 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:13:09 np0005538960 python3.9[202383]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Nov 28 11:13:11 np0005538960 podman[202535]: 2025-11-28 16:13:11.099088894 +0000 UTC m=+0.150918084 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:13:11 np0005538960 python3[202536]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Nov 28 11:13:14 np0005538960 podman[202574]: 2025-11-28 16:13:14.51630845 +0000 UTC m=+3.184329575 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 28 11:13:14 np0005538960 podman[202671]: 2025-11-28 16:13:14.659867325 +0000 UTC m=+0.034486260 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 28 11:13:15 np0005538960 podman[202671]: 2025-11-28 16:13:15.172111538 +0000 UTC m=+0.546730423 container create 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, vcs-type=git, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, io.openshift.expose-services=)
Nov 28 11:13:15 np0005538960 python3[202536]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Nov 28 11:13:15 np0005538960 podman[202684]: 2025-11-28 16:13:15.240584275 +0000 UTC m=+0.137363255 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 11:13:16 np0005538960 podman[202851]: 2025-11-28 16:13:16.206228208 +0000 UTC m=+0.105743076 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:13:16 np0005538960 python3.9[202896]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:13:17 np0005538960 python3.9[203053]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:13:18 np0005538960 python3.9[203204]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1764346397.5763958-2364-279366112547358/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:13:19 np0005538960 python3.9[203280]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Nov 28 11:13:19 np0005538960 systemd[1]: Reloading.
Nov 28 11:13:19 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:13:19 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:13:20 np0005538960 python3.9[203392]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Nov 28 11:13:20 np0005538960 systemd[1]: Reloading.
Nov 28 11:13:20 np0005538960 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Nov 28 11:13:20 np0005538960 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Nov 28 11:13:21 np0005538960 systemd[1]: Starting openstack_network_exporter container...
Nov 28 11:13:21 np0005538960 podman[203431]: 2025-11-28 16:13:21.936070563 +0000 UTC m=+0.629800176 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:13:21 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:13:21 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9da4202dba53172c6e08b78a52cce199f8106f55ae4af72502b36acb68b60a/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 11:13:21 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9da4202dba53172c6e08b78a52cce199f8106f55ae4af72502b36acb68b60a/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 11:13:21 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9da4202dba53172c6e08b78a52cce199f8106f55ae4af72502b36acb68b60a/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 11:13:22 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786.
Nov 28 11:13:22 np0005538960 podman[203433]: 2025-11-28 16:13:22.433139406 +0000 UTC m=+1.120108934 container init 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *bridge.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *coverage.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *datapath.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *iface.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *memory.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *ovnnorthd.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *ovn.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *ovsdbserver.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *pmd_perf.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *pmd_rxq.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: INFO    16:13:22 main.go:48: registering *vswitch.Collector
Nov 28 11:13:22 np0005538960 openstack_network_exporter[203472]: NOTICE  16:13:22 main.go:76: listening on https://:9105/metrics
Nov 28 11:13:22 np0005538960 podman[203433]: 2025-11-28 16:13:22.475643032 +0000 UTC m=+1.162612550 container start 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41)
Nov 28 11:13:22 np0005538960 podman[203433]: openstack_network_exporter
Nov 28 11:13:22 np0005538960 systemd[1]: Started openstack_network_exporter container.
Nov 28 11:13:22 np0005538960 podman[203483]: 2025-11-28 16:13:22.581255723 +0000 UTC m=+0.088449915 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, release=1755695350)
Nov 28 11:13:24 np0005538960 python3.9[203659]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Nov 28 11:13:24 np0005538960 systemd[1]: Stopping openstack_network_exporter container...
Nov 28 11:13:24 np0005538960 systemd[1]: libpod-1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786.scope: Deactivated successfully.
Nov 28 11:13:24 np0005538960 podman[203663]: 2025-11-28 16:13:24.220537478 +0000 UTC m=+0.064456150 container died 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, version=9.6, io.buildah.version=1.33.7)
Nov 28 11:13:24 np0005538960 systemd[1]: 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786-28d9d59bab22dc00.timer: Deactivated successfully.
Nov 28 11:13:24 np0005538960 systemd[1]: Stopped /usr/bin/podman healthcheck run 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786.
Nov 28 11:13:24 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786-userdata-shm.mount: Deactivated successfully.
Nov 28 11:13:24 np0005538960 systemd[1]: var-lib-containers-storage-overlay-6a9da4202dba53172c6e08b78a52cce199f8106f55ae4af72502b36acb68b60a-merged.mount: Deactivated successfully.
Nov 28 11:13:25 np0005538960 podman[203663]: 2025-11-28 16:13:25.825119478 +0000 UTC m=+1.669038190 container cleanup 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, release=1755695350, vcs-type=git, version=9.6, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Nov 28 11:13:25 np0005538960 podman[203663]: openstack_network_exporter
Nov 28 11:13:25 np0005538960 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Nov 28 11:13:25 np0005538960 podman[203692]: openstack_network_exporter
Nov 28 11:13:25 np0005538960 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Nov 28 11:13:25 np0005538960 systemd[1]: Stopped openstack_network_exporter container.
Nov 28 11:13:25 np0005538960 systemd[1]: Starting openstack_network_exporter container...
Nov 28 11:13:26 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:13:26 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9da4202dba53172c6e08b78a52cce199f8106f55ae4af72502b36acb68b60a/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Nov 28 11:13:26 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9da4202dba53172c6e08b78a52cce199f8106f55ae4af72502b36acb68b60a/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Nov 28 11:13:26 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9da4202dba53172c6e08b78a52cce199f8106f55ae4af72502b36acb68b60a/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Nov 28 11:13:26 np0005538960 systemd[1]: Started /usr/bin/podman healthcheck run 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786.
Nov 28 11:13:26 np0005538960 podman[203705]: 2025-11-28 16:13:26.079777939 +0000 UTC m=+0.139390816 container init 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, version=9.6, architecture=x86_64, managed_by=edpm_ansible, release=1755695350, vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *bridge.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *coverage.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *datapath.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *iface.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *memory.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *ovnnorthd.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *ovn.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *ovsdbserver.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *pmd_perf.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *pmd_rxq.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: INFO    16:13:26 main.go:48: registering *vswitch.Collector
Nov 28 11:13:26 np0005538960 openstack_network_exporter[203721]: NOTICE  16:13:26 main.go:76: listening on https://:9105/metrics
Nov 28 11:13:26 np0005538960 podman[203705]: 2025-11-28 16:13:26.116135554 +0000 UTC m=+0.175748331 container start 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 11:13:26 np0005538960 podman[203705]: openstack_network_exporter
Nov 28 11:13:26 np0005538960 systemd[1]: Started openstack_network_exporter container.
Nov 28 11:13:26 np0005538960 podman[203732]: 2025-11-28 16:13:26.20179255 +0000 UTC m=+0.070501739 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Nov 28 11:13:26 np0005538960 python3.9[203904]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Nov 28 11:13:35 np0005538960 podman[203929]: 2025-11-28 16:13:35.164451269 +0000 UTC m=+0.068681354 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:13:35 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-464ee5ad3ed80444.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 11:13:35 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-464ee5ad3ed80444.service: Failed with result 'exit-code'.
Nov 28 11:13:37 np0005538960 podman[203948]: 2025-11-28 16:13:37.195065571 +0000 UTC m=+0.091197881 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:13:41 np0005538960 podman[203975]: 2025-11-28 16:13:41.615857353 +0000 UTC m=+0.126078071 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 11:13:46 np0005538960 podman[204001]: 2025-11-28 16:13:46.150009265 +0000 UTC m=+0.055334678 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:13:47 np0005538960 podman[204020]: 2025-11-28 16:13:47.198299709 +0000 UTC m=+0.091069318 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:13:52 np0005538960 podman[204042]: 2025-11-28 16:13:52.184983609 +0000 UTC m=+0.078901892 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:13:57 np0005538960 podman[204066]: 2025-11-28 16:13:57.187262459 +0000 UTC m=+0.087802699 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, name=ubi9-minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc.)
Nov 28 11:14:06 np0005538960 podman[204088]: 2025-11-28 16:14:06.165565512 +0000 UTC m=+0.067504485 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:14:06 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-464ee5ad3ed80444.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 11:14:06 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-464ee5ad3ed80444.service: Failed with result 'exit-code'.
Nov 28 11:14:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:14:06.332 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:14:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:14:06.332 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:14:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:14:06.332 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:14:07 np0005538960 nova_compute[187252]: 2025-11-28 16:14:07.653 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:14:08 np0005538960 podman[204107]: 2025-11-28 16:14:08.158049676 +0000 UTC m=+0.058912205 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 11:14:08 np0005538960 nova_compute[187252]: 2025-11-28 16:14:08.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:14:08 np0005538960 nova_compute[187252]: 2025-11-28 16:14:08.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:14:08 np0005538960 nova_compute[187252]: 2025-11-28 16:14:08.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:14:08 np0005538960 nova_compute[187252]: 2025-11-28 16:14:08.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:14:09 np0005538960 nova_compute[187252]: 2025-11-28 16:14:09.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:14:09 np0005538960 nova_compute[187252]: 2025-11-28 16:14:09.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.347 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.348 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.348 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.348 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.502 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.503 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6012MB free_disk=73.37506866455078GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.504 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.504 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.572 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.573 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.608 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.622 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.624 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:14:10 np0005538960 nova_compute[187252]: 2025-11-28 16:14:10.624 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:14:11 np0005538960 nova_compute[187252]: 2025-11-28 16:14:11.625 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:14:11 np0005538960 nova_compute[187252]: 2025-11-28 16:14:11.626 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:14:11 np0005538960 nova_compute[187252]: 2025-11-28 16:14:11.626 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:14:11 np0005538960 nova_compute[187252]: 2025-11-28 16:14:11.640 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:14:12 np0005538960 podman[204134]: 2025-11-28 16:14:12.191174298 +0000 UTC m=+0.100106249 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3)
Nov 28 11:14:17 np0005538960 podman[204162]: 2025-11-28 16:14:17.141940915 +0000 UTC m=+0.048742243 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:14:18 np0005538960 podman[204182]: 2025-11-28 16:14:18.189119872 +0000 UTC m=+0.094393828 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Nov 28 11:14:23 np0005538960 podman[204278]: 2025-11-28 16:14:23.141688162 +0000 UTC m=+0.051079291 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:14:23 np0005538960 python3.9[204354]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Nov 28 11:14:24 np0005538960 python3.9[204519]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:24 np0005538960 systemd[1]: Started libpod-conmon-36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707.scope.
Nov 28 11:14:24 np0005538960 podman[204520]: 2025-11-28 16:14:24.42979562 +0000 UTC m=+0.111709969 container exec 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible)
Nov 28 11:14:24 np0005538960 podman[204520]: 2025-11-28 16:14:24.440384453 +0000 UTC m=+0.122298842 container exec_died 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 11:14:24 np0005538960 systemd[1]: libpod-conmon-36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707.scope: Deactivated successfully.
Nov 28 11:14:25 np0005538960 python3.9[204704]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:25 np0005538960 systemd[1]: Started libpod-conmon-36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707.scope.
Nov 28 11:14:25 np0005538960 podman[204705]: 2025-11-28 16:14:25.368562851 +0000 UTC m=+0.082965044 container exec 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 11:14:25 np0005538960 podman[204705]: 2025-11-28 16:14:25.399867679 +0000 UTC m=+0.114269842 container exec_died 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 11:14:25 np0005538960 systemd[1]: libpod-conmon-36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707.scope: Deactivated successfully.
Nov 28 11:14:26 np0005538960 python3.9[204887]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:27 np0005538960 python3.9[205039]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Nov 28 11:14:27 np0005538960 podman[205176]: 2025-11-28 16:14:27.738282692 +0000 UTC m=+0.071158421 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm)
Nov 28 11:14:27 np0005538960 python3.9[205221]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:28 np0005538960 systemd[1]: Started libpod-conmon-a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4.scope.
Nov 28 11:14:28 np0005538960 podman[205226]: 2025-11-28 16:14:28.896200992 +0000 UTC m=+0.936497386 container exec a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:14:28 np0005538960 podman[205226]: 2025-11-28 16:14:28.932532246 +0000 UTC m=+0.972828570 container exec_died a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:14:28 np0005538960 systemd[1]: libpod-conmon-a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4.scope: Deactivated successfully.
Nov 28 11:14:29 np0005538960 python3.9[205409]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:30 np0005538960 systemd[1]: Started libpod-conmon-a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4.scope.
Nov 28 11:14:30 np0005538960 podman[205410]: 2025-11-28 16:14:30.340626856 +0000 UTC m=+0.662085883 container exec a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 11:14:30 np0005538960 podman[205410]: 2025-11-28 16:14:30.375395501 +0000 UTC m=+0.696854498 container exec_died a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent)
Nov 28 11:14:30 np0005538960 systemd[1]: libpod-conmon-a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4.scope: Deactivated successfully.
Nov 28 11:14:31 np0005538960 python3.9[205591]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:31 np0005538960 python3.9[205743]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Nov 28 11:14:32 np0005538960 python3.9[205909]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:32 np0005538960 systemd[1]: Started libpod-conmon-7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a.scope.
Nov 28 11:14:32 np0005538960 podman[205910]: 2025-11-28 16:14:32.708922301 +0000 UTC m=+0.077741614 container exec 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:14:32 np0005538960 podman[205910]: 2025-11-28 16:14:32.743251435 +0000 UTC m=+0.112070728 container exec_died 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 11:14:32 np0005538960 systemd[1]: libpod-conmon-7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a.scope: Deactivated successfully.
Nov 28 11:14:33 np0005538960 python3.9[206090]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:33 np0005538960 systemd[1]: Started libpod-conmon-7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a.scope.
Nov 28 11:14:33 np0005538960 podman[206091]: 2025-11-28 16:14:33.528251532 +0000 UTC m=+0.066083473 container exec 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:14:33 np0005538960 podman[206091]: 2025-11-28 16:14:33.563319534 +0000 UTC m=+0.101151485 container exec_died 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:14:33 np0005538960 systemd[1]: libpod-conmon-7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a.scope: Deactivated successfully.
Nov 28 11:14:34 np0005538960 python3.9[206275]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:35 np0005538960 python3.9[206427]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.302 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.303 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:14:35.304 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:14:36 np0005538960 python3.9[206593]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:36 np0005538960 systemd[1]: Started libpod-conmon-1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683.scope.
Nov 28 11:14:36 np0005538960 podman[206594]: 2025-11-28 16:14:36.195383718 +0000 UTC m=+0.097844904 container exec 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:14:36 np0005538960 podman[206594]: 2025-11-28 16:14:36.23328822 +0000 UTC m=+0.135749406 container exec_died 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute)
Nov 28 11:14:36 np0005538960 podman[206612]: 2025-11-28 16:14:36.28114158 +0000 UTC m=+0.076911052 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=5, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:14:36 np0005538960 systemd[1]: libpod-conmon-1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683.scope: Deactivated successfully.
Nov 28 11:14:36 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-464ee5ad3ed80444.service: Main process exited, code=exited, status=1/FAILURE
Nov 28 11:14:36 np0005538960 systemd[1]: 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683-464ee5ad3ed80444.service: Failed with result 'exit-code'.
Nov 28 11:14:37 np0005538960 python3.9[206797]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:37 np0005538960 systemd[1]: Started libpod-conmon-1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683.scope.
Nov 28 11:14:37 np0005538960 podman[206798]: 2025-11-28 16:14:37.107103687 +0000 UTC m=+0.080960533 container exec 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 11:14:37 np0005538960 podman[206798]: 2025-11-28 16:14:37.142478707 +0000 UTC m=+0.116335543 container exec_died 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_id=edpm, tcib_managed=true)
Nov 28 11:14:37 np0005538960 systemd[1]: libpod-conmon-1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683.scope: Deactivated successfully.
Nov 28 11:14:37 np0005538960 python3.9[206981]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:38 np0005538960 podman[207105]: 2025-11-28 16:14:38.575467566 +0000 UTC m=+0.071370125 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:14:38 np0005538960 python3.9[207154]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Nov 28 11:14:39 np0005538960 python3.9[207320]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:39 np0005538960 systemd[1]: Started libpod-conmon-9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8.scope.
Nov 28 11:14:39 np0005538960 podman[207321]: 2025-11-28 16:14:39.744609176 +0000 UTC m=+0.095408623 container exec 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:14:39 np0005538960 podman[207321]: 2025-11-28 16:14:39.777527215 +0000 UTC m=+0.128326572 container exec_died 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:14:39 np0005538960 systemd[1]: libpod-conmon-9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8.scope: Deactivated successfully.
Nov 28 11:14:40 np0005538960 python3.9[207505]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:40 np0005538960 systemd[1]: Started libpod-conmon-9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8.scope.
Nov 28 11:14:40 np0005538960 podman[207506]: 2025-11-28 16:14:40.734120229 +0000 UTC m=+0.076226507 container exec 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:14:40 np0005538960 podman[207506]: 2025-11-28 16:14:40.764489895 +0000 UTC m=+0.106596143 container exec_died 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:14:40 np0005538960 systemd[1]: libpod-conmon-9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8.scope: Deactivated successfully.
Nov 28 11:14:41 np0005538960 python3.9[207688]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:42 np0005538960 python3.9[207840]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Nov 28 11:14:43 np0005538960 podman[207977]: 2025-11-28 16:14:43.12315999 +0000 UTC m=+0.116189270 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Nov 28 11:14:43 np0005538960 python3.9[208025]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:43 np0005538960 systemd[1]: Started libpod-conmon-7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee.scope.
Nov 28 11:14:43 np0005538960 podman[208033]: 2025-11-28 16:14:43.384306994 +0000 UTC m=+0.082780119 container exec 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:14:43 np0005538960 podman[208033]: 2025-11-28 16:14:43.41595101 +0000 UTC m=+0.114424045 container exec_died 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:14:43 np0005538960 systemd[1]: libpod-conmon-7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee.scope: Deactivated successfully.
Nov 28 11:14:44 np0005538960 python3.9[208216]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:44 np0005538960 systemd[1]: Started libpod-conmon-7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee.scope.
Nov 28 11:14:44 np0005538960 podman[208217]: 2025-11-28 16:14:44.236192565 +0000 UTC m=+0.081694882 container exec 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:14:44 np0005538960 podman[208217]: 2025-11-28 16:14:44.267294708 +0000 UTC m=+0.112796985 container exec_died 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:14:44 np0005538960 systemd[1]: libpod-conmon-7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee.scope: Deactivated successfully.
Nov 28 11:14:45 np0005538960 python3.9[208401]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:45 np0005538960 python3.9[208553]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Nov 28 11:14:46 np0005538960 python3.9[208717]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:47 np0005538960 systemd[1]: Started libpod-conmon-1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786.scope.
Nov 28 11:14:47 np0005538960 podman[208718]: 2025-11-28 16:14:47.331047104 +0000 UTC m=+0.620923279 container exec 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, version=9.6, config_id=edpm, release=1755695350, distribution-scope=public, managed_by=edpm_ansible, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 11:14:47 np0005538960 podman[208718]: 2025-11-28 16:14:47.33808596 +0000 UTC m=+0.627962165 container exec_died 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6, config_id=edpm, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, distribution-scope=public, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 11:14:47 np0005538960 systemd[1]: libpod-conmon-1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786.scope: Deactivated successfully.
Nov 28 11:14:47 np0005538960 podman[208734]: 2025-11-28 16:14:47.420698884 +0000 UTC m=+0.089147998 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 11:14:48 np0005538960 python3.9[208917]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Nov 28 11:14:48 np0005538960 systemd[1]: Started libpod-conmon-1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786.scope.
Nov 28 11:14:48 np0005538960 podman[208918]: 2025-11-28 16:14:48.342861722 +0000 UTC m=+0.098088770 container exec 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 11:14:48 np0005538960 podman[208918]: 2025-11-28 16:14:48.374793606 +0000 UTC m=+0.130020604 container exec_died 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, version=9.6, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Nov 28 11:14:48 np0005538960 systemd[1]: libpod-conmon-1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786.scope: Deactivated successfully.
Nov 28 11:14:48 np0005538960 podman[208935]: 2025-11-28 16:14:48.457138264 +0000 UTC m=+0.112837637 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd)
Nov 28 11:14:49 np0005538960 python3.9[209118]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:50 np0005538960 python3.9[209270]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:51 np0005538960 python3.9[209422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:14:51 np0005538960 python3.9[209545]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1764346490.4675372-3207-82302644983812/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:53 np0005538960 python3.9[209697]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:53 np0005538960 podman[209821]: 2025-11-28 16:14:53.65085062 +0000 UTC m=+0.065776107 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:14:53 np0005538960 python3.9[209871]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:14:54 np0005538960 python3.9[209951]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:55 np0005538960 python3.9[210103]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:14:55 np0005538960 python3.9[210181]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.lbf_kipm recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:56 np0005538960 python3.9[210333]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:14:56 np0005538960 python3.9[210411]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:14:57 np0005538960 python3.9[210563]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:14:58 np0005538960 podman[210589]: 2025-11-28 16:14:58.175741166 +0000 UTC m=+0.072325019 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Nov 28 11:14:58 np0005538960 python3[210738]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Nov 28 11:14:59 np0005538960 python3.9[210890]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:15:00 np0005538960 python3.9[210968]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:15:01 np0005538960 python3.9[211120]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:15:01 np0005538960 python3.9[211198]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:15:02 np0005538960 python3.9[211350]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:15:02 np0005538960 python3.9[211428]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:15:03 np0005538960 python3.9[211580]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:15:04 np0005538960 python3.9[211658]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:15:05 np0005538960 python3.9[211810]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Nov 28 11:15:05 np0005538960 python3.9[211935]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1764346504.4643025-3582-193937342415378/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:15:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:15:06.334 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:15:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:15:06.335 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:15:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:15:06.336 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:15:06 np0005538960 podman[212059]: 2025-11-28 16:15:06.622686111 +0000 UTC m=+0.060298570 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 11:15:06 np0005538960 python3.9[212108]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:15:07 np0005538960 python3.9[212260]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:15:08 np0005538960 nova_compute[187252]: 2025-11-28 16:15:08.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:15:08 np0005538960 nova_compute[187252]: 2025-11-28 16:15:08.397 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:15:08 np0005538960 python3.9[212417]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:15:09 np0005538960 podman[212518]: 2025-11-28 16:15:09.19182472 +0000 UTC m=+0.084263756 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:15:09 np0005538960 nova_compute[187252]: 2025-11-28 16:15:09.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:15:09 np0005538960 python3.9[212593]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:15:10 np0005538960 python3.9[212746]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Nov 28 11:15:10 np0005538960 nova_compute[187252]: 2025-11-28 16:15:10.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:15:10 np0005538960 nova_compute[187252]: 2025-11-28 16:15:10.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:15:10 np0005538960 nova_compute[187252]: 2025-11-28 16:15:10.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:15:10 np0005538960 nova_compute[187252]: 2025-11-28 16:15:10.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:15:11 np0005538960 python3.9[212900]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.340 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.340 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.485 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.487 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5954MB free_disk=73.37834930419922GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.487 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.487 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.549 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.550 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.573 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.585 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.586 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:15:11 np0005538960 nova_compute[187252]: 2025-11-28 16:15:11.586 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:15:11 np0005538960 python3.9[213055]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Nov 28 11:15:12 np0005538960 systemd-logind[788]: Session 27 logged out. Waiting for processes to exit.
Nov 28 11:15:12 np0005538960 systemd[1]: session-27.scope: Deactivated successfully.
Nov 28 11:15:12 np0005538960 systemd[1]: session-27.scope: Consumed 1min 52.486s CPU time.
Nov 28 11:15:12 np0005538960 systemd-logind[788]: Removed session 27.
Nov 28 11:15:12 np0005538960 nova_compute[187252]: 2025-11-28 16:15:12.588 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:15:12 np0005538960 nova_compute[187252]: 2025-11-28 16:15:12.588 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:15:12 np0005538960 nova_compute[187252]: 2025-11-28 16:15:12.588 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:15:12 np0005538960 nova_compute[187252]: 2025-11-28 16:15:12.604 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:15:14 np0005538960 podman[213080]: 2025-11-28 16:15:14.239253489 +0000 UTC m=+0.131807609 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 28 11:15:18 np0005538960 podman[213107]: 2025-11-28 16:15:18.149062956 +0000 UTC m=+0.059309159 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:15:19 np0005538960 podman[213126]: 2025-11-28 16:15:19.166648547 +0000 UTC m=+0.070222996 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3)
Nov 28 11:15:24 np0005538960 podman[213147]: 2025-11-28 16:15:24.149146499 +0000 UTC m=+0.049932999 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:15:29 np0005538960 podman[213171]: 2025-11-28 16:15:29.15118063 +0000 UTC m=+0.062314703 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, config_id=edpm, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Nov 28 11:15:37 np0005538960 podman[213193]: 2025-11-28 16:15:37.18677435 +0000 UTC m=+0.074786918 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 11:15:40 np0005538960 podman[213214]: 2025-11-28 16:15:40.168456714 +0000 UTC m=+0.074292257 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:15:45 np0005538960 podman[213238]: 2025-11-28 16:15:45.199808589 +0000 UTC m=+0.096951868 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 28 11:15:49 np0005538960 podman[213264]: 2025-11-28 16:15:49.206326499 +0000 UTC m=+0.105057047 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 11:15:49 np0005538960 podman[213283]: 2025-11-28 16:15:49.312335347 +0000 UTC m=+0.091601978 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 11:15:53 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:15:53.257 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:15:53 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:15:53.258 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:15:53 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:15:53.260 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:15:55 np0005538960 podman[213308]: 2025-11-28 16:15:55.194500199 +0000 UTC m=+0.090637704 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:16:00 np0005538960 podman[213334]: 2025-11-28 16:16:00.207627891 +0000 UTC m=+0.105444517 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, managed_by=edpm_ansible, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public)
Nov 28 11:16:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:16:06.336 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:16:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:16:06.336 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:16:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:16:06.336 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:16:07 np0005538960 nova_compute[187252]: 2025-11-28 16:16:07.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:07 np0005538960 nova_compute[187252]: 2025-11-28 16:16:07.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 11:16:07 np0005538960 nova_compute[187252]: 2025-11-28 16:16:07.329 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 11:16:07 np0005538960 nova_compute[187252]: 2025-11-28 16:16:07.330 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:07 np0005538960 nova_compute[187252]: 2025-11-28 16:16:07.330 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 11:16:07 np0005538960 nova_compute[187252]: 2025-11-28 16:16:07.338 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:08 np0005538960 podman[213357]: 2025-11-28 16:16:08.160296376 +0000 UTC m=+0.066515266 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:16:09 np0005538960 nova_compute[187252]: 2025-11-28 16:16:09.345 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:10 np0005538960 nova_compute[187252]: 2025-11-28 16:16:10.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:10 np0005538960 nova_compute[187252]: 2025-11-28 16:16:10.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:10 np0005538960 nova_compute[187252]: 2025-11-28 16:16:10.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:10 np0005538960 nova_compute[187252]: 2025-11-28 16:16:10.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:16:11 np0005538960 podman[213377]: 2025-11-28 16:16:11.149611685 +0000 UTC m=+0.054940763 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:16:11 np0005538960 nova_compute[187252]: 2025-11-28 16:16:11.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:12 np0005538960 nova_compute[187252]: 2025-11-28 16:16:12.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:12 np0005538960 nova_compute[187252]: 2025-11-28 16:16:12.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:16:12 np0005538960 nova_compute[187252]: 2025-11-28 16:16:12.317 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:16:12 np0005538960 nova_compute[187252]: 2025-11-28 16:16:12.337 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:16:12 np0005538960 nova_compute[187252]: 2025-11-28 16:16:12.338 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.338 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.503 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.504 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6026MB free_disk=73.3788070678711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.504 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.505 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.616 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.617 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.679 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing inventories for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.699 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating ProviderTree inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.700 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.748 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing aggregate associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.771 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing trait associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, traits: COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.792 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.805 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.807 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 11:16:13 np0005538960 nova_compute[187252]: 2025-11-28 16:16:13.807 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:16:16 np0005538960 podman[213402]: 2025-11-28 16:16:16.203101251 +0000 UTC m=+0.111009913 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 11:16:20 np0005538960 podman[213428]: 2025-11-28 16:16:20.158371544 +0000 UTC m=+0.060912301 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 11:16:20 np0005538960 podman[213429]: 2025-11-28 16:16:20.193917498 +0000 UTC m=+0.090944831 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 11:16:26 np0005538960 podman[213466]: 2025-11-28 16:16:26.167144395 +0000 UTC m=+0.076754127 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:16:31 np0005538960 podman[213492]: 2025-11-28 16:16:31.160002514 +0000 UTC m=+0.069266325 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:16:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:16:39 np0005538960 podman[213513]: 2025-11-28 16:16:39.197537361 +0000 UTC m=+0.096910656 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Nov 28 11:16:42 np0005538960 podman[213534]: 2025-11-28 16:16:42.155024221 +0000 UTC m=+0.059092186 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:16:47 np0005538960 podman[213560]: 2025-11-28 16:16:47.20628327 +0000 UTC m=+0.115430885 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:16:51 np0005538960 podman[213587]: 2025-11-28 16:16:51.184239611 +0000 UTC m=+0.073684341 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:16:51 np0005538960 podman[213586]: 2025-11-28 16:16:51.187588793 +0000 UTC m=+0.078993321 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 11:16:57 np0005538960 podman[213624]: 2025-11-28 16:16:57.179507453 +0000 UTC m=+0.082822033 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:17:02 np0005538960 podman[213648]: 2025-11-28 16:17:02.158931615 +0000 UTC m=+0.064121088 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vendor=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, build-date=2025-08-20T13:12:41)
Nov 28 11:17:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:17:06.337 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:17:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:17:06.338 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:17:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:17:06.338 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:17:10 np0005538960 podman[213670]: 2025-11-28 16:17:10.181148301 +0000 UTC m=+0.084549315 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 11:17:11 np0005538960 nova_compute[187252]: 2025-11-28 16:17:11.802 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:17:11 np0005538960 nova_compute[187252]: 2025-11-28 16:17:11.818 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:17:11 np0005538960 nova_compute[187252]: 2025-11-28 16:17:11.818 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:17:11 np0005538960 nova_compute[187252]: 2025-11-28 16:17:11.818 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:17:11 np0005538960 nova_compute[187252]: 2025-11-28 16:17:11.818 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:17:12 np0005538960 nova_compute[187252]: 2025-11-28 16:17:12.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:17:12 np0005538960 nova_compute[187252]: 2025-11-28 16:17:12.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:17:12 np0005538960 nova_compute[187252]: 2025-11-28 16:17:12.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:17:12 np0005538960 nova_compute[187252]: 2025-11-28 16:17:12.330 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:17:12 np0005538960 nova_compute[187252]: 2025-11-28 16:17:12.331 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:17:13 np0005538960 podman[213692]: 2025-11-28 16:17:13.154532998 +0000 UTC m=+0.064397776 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.340 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.496 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.498 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6041MB free_disk=73.37878799438477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.498 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.499 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.566 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.567 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.591 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.605 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.607 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:17:13 np0005538960 nova_compute[187252]: 2025-11-28 16:17:13.607 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:17:14 np0005538960 nova_compute[187252]: 2025-11-28 16:17:14.603 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:17:14 np0005538960 nova_compute[187252]: 2025-11-28 16:17:14.605 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:17:18 np0005538960 podman[213718]: 2025-11-28 16:17:18.246137625 +0000 UTC m=+0.133883953 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 11:17:22 np0005538960 podman[213747]: 2025-11-28 16:17:22.949604783 +0000 UTC m=+0.089177177 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, 
org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 11:17:22 np0005538960 podman[213744]: 2025-11-28 16:17:22.977747027 +0000 UTC m=+0.116694376 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 28 11:17:28 np0005538960 podman[213785]: 2025-11-28 16:17:28.176896958 +0000 UTC m=+0.074384928 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:17:32 np0005538960 podman[213811]: 2025-11-28 16:17:32.57015263 +0000 UTC m=+0.087036037 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vendor=Red Hat, Inc.)
Nov 28 11:17:41 np0005538960 podman[213832]: 2025-11-28 16:17:41.170245612 +0000 UTC m=+0.079682756 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:17:44 np0005538960 podman[213852]: 2025-11-28 16:17:44.157929586 +0000 UTC m=+0.061360352 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:17:49 np0005538960 podman[213877]: 2025-11-28 16:17:49.188931612 +0000 UTC m=+0.088389499 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:17:52 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:17:52.242 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:17:52 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:17:52.244 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:17:53 np0005538960 podman[213907]: 2025-11-28 16:17:53.154003631 +0000 UTC m=+0.058748159 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:17:53 np0005538960 podman[213908]: 2025-11-28 16:17:53.159806902 +0000 UTC m=+0.055931620 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:17:59 np0005538960 podman[213945]: 2025-11-28 16:17:59.154062749 +0000 UTC m=+0.063609797 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:18:00 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:00.247 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:18:03 np0005538960 podman[213969]: 2025-11-28 16:18:03.181239036 +0000 UTC m=+0.088998922 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, name=ubi9-minimal, release=1755695350, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., architecture=x86_64, container_name=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:18:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:06.338 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:18:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:06.338 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:18:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:06.338 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:18:11 np0005538960 nova_compute[187252]: 2025-11-28 16:18:11.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:18:12 np0005538960 podman[213991]: 2025-11-28 16:18:12.181070783 +0000 UTC m=+0.080769233 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 28 11:18:12 np0005538960 nova_compute[187252]: 2025-11-28 16:18:12.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:18:12 np0005538960 nova_compute[187252]: 2025-11-28 16:18:12.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.335 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.335 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.336 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.359 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.359 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.359 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.360 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.514 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.515 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=6052MB free_disk=73.3788070678711GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.515 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.516 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.603 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.604 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.666 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.683 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.684 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:18:13 np0005538960 nova_compute[187252]: 2025-11-28 16:18:13.685 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:18:14 np0005538960 nova_compute[187252]: 2025-11-28 16:18:14.663 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:18:14 np0005538960 nova_compute[187252]: 2025-11-28 16:18:14.664 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:18:15 np0005538960 podman[214012]: 2025-11-28 16:18:15.145378589 +0000 UTC m=+0.047279139 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:18:15 np0005538960 nova_compute[187252]: 2025-11-28 16:18:15.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:18:16 np0005538960 nova_compute[187252]: 2025-11-28 16:18:16.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:18:20 np0005538960 podman[214038]: 2025-11-28 16:18:20.176502537 +0000 UTC m=+0.085022196 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:18:24 np0005538960 podman[214065]: 2025-11-28 16:18:24.261874819 +0000 UTC m=+0.060484710 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 28 11:18:24 np0005538960 podman[214064]: 2025-11-28 16:18:24.29769282 +0000 UTC m=+0.098070973 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:18:26 np0005538960 nova_compute[187252]: 2025-11-28 16:18:26.969 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "eff28834-4c5b-46d0-90a8-4be63b9fff80" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:18:26 np0005538960 nova_compute[187252]: 2025-11-28 16:18:26.970 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:18:26 np0005538960 nova_compute[187252]: 2025-11-28 16:18:26.993 187256 DEBUG nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.093 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.093 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.100 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.100 187256 INFO nova.compute.claims [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Claim successful on node compute-1.ctlplane.example.com
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.249 187256 DEBUG nova.compute.provider_tree [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.261 187256 DEBUG nova.scheduler.client.report [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.282 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.282 187256 DEBUG nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.344 187256 DEBUG nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.345 187256 DEBUG nova.network.neutron [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.367 187256 INFO nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.553 187256 DEBUG nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.653 187256 DEBUG nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.655 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.655 187256 INFO nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Creating image(s)
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.656 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "/var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.656 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "/var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.657 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "/var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.657 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:18:27 np0005538960 nova_compute[187252]: 2025-11-28 16:18:27.658 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:18:28 np0005538960 nova_compute[187252]: 2025-11-28 16:18:28.640 187256 WARNING oslo_policy.policy [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 28 11:18:28 np0005538960 nova_compute[187252]: 2025-11-28 16:18:28.641 187256 WARNING oslo_policy.policy [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Nov 28 11:18:28 np0005538960 nova_compute[187252]: 2025-11-28 16:18:28.644 187256 DEBUG nova.policy [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 11:18:30 np0005538960 podman[214103]: 2025-11-28 16:18:30.145725795 +0000 UTC m=+0.052537942 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:18:30 np0005538960 nova_compute[187252]: 2025-11-28 16:18:30.564 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:18:30 np0005538960 nova_compute[187252]: 2025-11-28 16:18:30.678 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc.part --force-share --output=json" returned: 0 in 0.113s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:18:30 np0005538960 nova_compute[187252]: 2025-11-28 16:18:30.679 187256 DEBUG nova.virt.images [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] 48a87826-de14-4dde-9157-9baf2160cd7d was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Nov 28 11:18:30 np0005538960 nova_compute[187252]: 2025-11-28 16:18:30.680 187256 DEBUG nova.privsep.utils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Nov 28 11:18:30 np0005538960 nova_compute[187252]: 2025-11-28 16:18:30.681 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc.part /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:18:30 np0005538960 nova_compute[187252]: 2025-11-28 16:18:30.907 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc.part /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc.converted" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:18:30 np0005538960 nova_compute[187252]: 2025-11-28 16:18:30.912 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:18:30 np0005538960 nova_compute[187252]: 2025-11-28 16:18:30.971 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc.converted --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:18:30 np0005538960 nova_compute[187252]: 2025-11-28 16:18:30.972 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 3.315s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:18:30 np0005538960 nova_compute[187252]: 2025-11-28 16:18:30.984 187256 INFO oslo.privsep.daemon [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpfo1bjubr/privsep.sock']
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.706 187256 INFO oslo.privsep.daemon [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Spawned new privsep daemon via rootwrap
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.543 214150 INFO oslo.privsep.daemon [-] privsep daemon starting
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.549 214150 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.552 214150 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.552 214150 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214150
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.800 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.857 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.858 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.859 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.873 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.948 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.950 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.990 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.991 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.132s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:18:31 np0005538960 nova_compute[187252]: 2025-11-28 16:18:31.992 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.062 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.064 187256 DEBUG nova.virt.disk.api [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Checking if we can resize image /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.065 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.128 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.129 187256 DEBUG nova.virt.disk.api [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Cannot resize image /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.129 187256 DEBUG nova.objects.instance [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'migration_context' on Instance uuid eff28834-4c5b-46d0-90a8-4be63b9fff80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.143 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.144 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Ensure instance console log exists: /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.144 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.145 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.145 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:18:32 np0005538960 nova_compute[187252]: 2025-11-28 16:18:32.467 187256 DEBUG nova.network.neutron [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Successfully created port: 6b498512-32dd-4e59-95bd-71c3a69bb44f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Nov 28 11:18:34 np0005538960 podman[214167]: 2025-11-28 16:18:34.177235262 +0000 UTC m=+0.074280058 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.306 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.307 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:18:35.308 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:18:35 np0005538960 nova_compute[187252]: 2025-11-28 16:18:35.374 187256 DEBUG nova.network.neutron [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Successfully updated port: 6b498512-32dd-4e59-95bd-71c3a69bb44f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:18:35 np0005538960 nova_compute[187252]: 2025-11-28 16:18:35.397 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:18:35 np0005538960 nova_compute[187252]: 2025-11-28 16:18:35.397 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquired lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:18:35 np0005538960 nova_compute[187252]: 2025-11-28 16:18:35.398 187256 DEBUG nova.network.neutron [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:18:35 np0005538960 nova_compute[187252]: 2025-11-28 16:18:35.725 187256 DEBUG nova.network.neutron [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:18:37 np0005538960 nova_compute[187252]: 2025-11-28 16:18:37.442 187256 DEBUG nova.compute.manager [req-d924870c-386f-4a79-8c1d-66077a4e66c4 req-86658a41-75c0-4aad-b5f5-4a5f20cd016f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received event network-changed-6b498512-32dd-4e59-95bd-71c3a69bb44f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:18:37 np0005538960 nova_compute[187252]: 2025-11-28 16:18:37.443 187256 DEBUG nova.compute.manager [req-d924870c-386f-4a79-8c1d-66077a4e66c4 req-86658a41-75c0-4aad-b5f5-4a5f20cd016f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Refreshing instance network info cache due to event network-changed-6b498512-32dd-4e59-95bd-71c3a69bb44f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:18:37 np0005538960 nova_compute[187252]: 2025-11-28 16:18:37.443 187256 DEBUG oslo_concurrency.lockutils [req-d924870c-386f-4a79-8c1d-66077a4e66c4 req-86658a41-75c0-4aad-b5f5-4a5f20cd016f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.232 187256 DEBUG nova.network.neutron [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updating instance_info_cache with network_info: [{"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.514 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Releasing lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.514 187256 DEBUG nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Instance network_info: |[{"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.515 187256 DEBUG oslo_concurrency.lockutils [req-d924870c-386f-4a79-8c1d-66077a4e66c4 req-86658a41-75c0-4aad-b5f5-4a5f20cd016f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.515 187256 DEBUG nova.network.neutron [req-d924870c-386f-4a79-8c1d-66077a4e66c4 req-86658a41-75c0-4aad-b5f5-4a5f20cd016f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Refreshing network info cache for port 6b498512-32dd-4e59-95bd-71c3a69bb44f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.519 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Start _get_guest_xml network_info=[{"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.524 187256 WARNING nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.531 187256 DEBUG nova.virt.libvirt.host [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.532 187256 DEBUG nova.virt.libvirt.host [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.536 187256 DEBUG nova.virt.libvirt.host [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.536 187256 DEBUG nova.virt.libvirt.host [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.538 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.538 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.538 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.539 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.539 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.539 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.539 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.539 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.539 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.540 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.540 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.540 187256 DEBUG nova.virt.hardware [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.544 187256 DEBUG nova.privsep.utils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.545 187256 DEBUG nova.virt.libvirt.vif [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:18:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-5027134',display_name='tempest-TestNetworkBasicOps-server-5027134',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-5027134',id=3,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE8qss00IgI7JJJCpwpUtOPNZOKd3VbUa+RvJPQz0L1gQHQ/A/taf962uh+XblnyqL6/863JdV1hTqTrPBvdsLWy2S9tfii7CzAhBQLzdbaC8IXYKznxQHwcqbW6UCp+cA==',key_name='tempest-TestNetworkBasicOps-197299468',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-ngyv601g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:18:27Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=eff28834-4c5b-46d0-90a8-4be63b9fff80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.545 187256 DEBUG nova.network.os_vif_util [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.546 187256 DEBUG nova.network.os_vif_util [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:00:db,bridge_name='br-int',has_traffic_filtering=True,id=6b498512-32dd-4e59-95bd-71c3a69bb44f,network=Network(e779c78f-4948-46c1-a91a-4b1068ceaae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b498512-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.547 187256 DEBUG nova.objects.instance [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'pci_devices' on Instance uuid eff28834-4c5b-46d0-90a8-4be63b9fff80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.560 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <uuid>eff28834-4c5b-46d0-90a8-4be63b9fff80</uuid>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <name>instance-00000003</name>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkBasicOps-server-5027134</nova:name>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:18:39</nova:creationTime>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:        <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:        <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:        <nova:port uuid="6b498512-32dd-4e59-95bd-71c3a69bb44f">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <entry name="serial">eff28834-4c5b-46d0-90a8-4be63b9fff80</entry>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <entry name="uuid">eff28834-4c5b-46d0-90a8-4be63b9fff80</entry>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.config"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:f3:00:db"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <target dev="tap6b498512-32"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/console.log" append="off"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:18:39 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:18:39 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:18:39 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:18:39 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.562 187256 DEBUG nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Preparing to wait for external event network-vif-plugged-6b498512-32dd-4e59-95bd-71c3a69bb44f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.562 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.562 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.562 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.563 187256 DEBUG nova.virt.libvirt.vif [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:18:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-5027134',display_name='tempest-TestNetworkBasicOps-server-5027134',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-5027134',id=3,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE8qss00IgI7JJJCpwpUtOPNZOKd3VbUa+RvJPQz0L1gQHQ/A/taf962uh+XblnyqL6/863JdV1hTqTrPBvdsLWy2S9tfii7CzAhBQLzdbaC8IXYKznxQHwcqbW6UCp+cA==',key_name='tempest-TestNetworkBasicOps-197299468',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-ngyv601g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:18:27Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=eff28834-4c5b-46d0-90a8-4be63b9fff80,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.563 187256 DEBUG nova.network.os_vif_util [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.564 187256 DEBUG nova.network.os_vif_util [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:00:db,bridge_name='br-int',has_traffic_filtering=True,id=6b498512-32dd-4e59-95bd-71c3a69bb44f,network=Network(e779c78f-4948-46c1-a91a-4b1068ceaae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b498512-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.564 187256 DEBUG os_vif [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:00:db,bridge_name='br-int',has_traffic_filtering=True,id=6b498512-32dd-4e59-95bd-71c3a69bb44f,network=Network(e779c78f-4948-46c1-a91a-4b1068ceaae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b498512-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.597 187256 DEBUG ovsdbapp.backend.ovs_idl [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.598 187256 DEBUG ovsdbapp.backend.ovs_idl [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.598 187256 DEBUG ovsdbapp.backend.ovs_idl [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.598 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.599 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.599 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.600 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.601 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.603 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.613 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.614 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.614 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:18:39 np0005538960 nova_compute[187252]: 2025-11-28 16:18:39.615 187256 INFO oslo.privsep.daemon [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpewjd2tm9/privsep.sock']#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.400 187256 INFO oslo.privsep.daemon [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Spawned new privsep daemon via rootwrap#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.217 214193 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.222 214193 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.224 214193 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.224 214193 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214193#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.713 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.714 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b498512-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.715 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6b498512-32, col_values=(('external_ids', {'iface-id': '6b498512-32dd-4e59-95bd-71c3a69bb44f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:00:db', 'vm-uuid': 'eff28834-4c5b-46d0-90a8-4be63b9fff80'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.717 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:40 np0005538960 NetworkManager[55548]: <info>  [1764346720.7199] manager: (tap6b498512-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.721 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.726 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.728 187256 INFO os_vif [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:00:db,bridge_name='br-int',has_traffic_filtering=True,id=6b498512-32dd-4e59-95bd-71c3a69bb44f,network=Network(e779c78f-4948-46c1-a91a-4b1068ceaae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b498512-32')#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.779 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.779 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.780 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No VIF found with MAC fa:16:3e:f3:00:db, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:18:40 np0005538960 nova_compute[187252]: 2025-11-28 16:18:40.780 187256 INFO nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Using config drive#033[00m
Nov 28 11:18:41 np0005538960 nova_compute[187252]: 2025-11-28 16:18:41.432 187256 INFO nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Creating config drive at /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.config#033[00m
Nov 28 11:18:41 np0005538960 nova_compute[187252]: 2025-11-28 16:18:41.440 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4d6armhy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:18:41 np0005538960 nova_compute[187252]: 2025-11-28 16:18:41.567 187256 DEBUG oslo_concurrency.processutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4d6armhy" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:18:41 np0005538960 kernel: tun: Universal TUN/TAP device driver, 1.6
Nov 28 11:18:41 np0005538960 kernel: tap6b498512-32: entered promiscuous mode
Nov 28 11:18:41 np0005538960 NetworkManager[55548]: <info>  [1764346721.6512] manager: (tap6b498512-32): new Tun device (/org/freedesktop/NetworkManager/Devices/22)
Nov 28 11:18:41 np0005538960 ovn_controller[95460]: 2025-11-28T16:18:41Z|00027|binding|INFO|Claiming lport 6b498512-32dd-4e59-95bd-71c3a69bb44f for this chassis.
Nov 28 11:18:41 np0005538960 ovn_controller[95460]: 2025-11-28T16:18:41Z|00028|binding|INFO|6b498512-32dd-4e59-95bd-71c3a69bb44f: Claiming fa:16:3e:f3:00:db 10.100.0.14
Nov 28 11:18:41 np0005538960 nova_compute[187252]: 2025-11-28 16:18:41.653 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:41 np0005538960 nova_compute[187252]: 2025-11-28 16:18:41.659 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:41.675 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:00:db 10.100.0.14'], port_security=['fa:16:3e:f3:00:db 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e779c78f-4948-46c1-a91a-4b1068ceaae1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '2', 'neutron:security_group_ids': '870cc982-353e-41b1-a555-537b285e8e4c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb853e10-1f64-4b13-bf92-c660af7671ba, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=6b498512-32dd-4e59-95bd-71c3a69bb44f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:18:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:41.677 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 6b498512-32dd-4e59-95bd-71c3a69bb44f in datapath e779c78f-4948-46c1-a91a-4b1068ceaae1 bound to our chassis#033[00m
Nov 28 11:18:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:41.679 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e779c78f-4948-46c1-a91a-4b1068ceaae1#033[00m
Nov 28 11:18:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:41.680 104369 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpsc6ko7ke/privsep.sock']#033[00m
Nov 28 11:18:41 np0005538960 systemd-udevd[214218]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:18:41 np0005538960 NetworkManager[55548]: <info>  [1764346721.7079] device (tap6b498512-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:18:41 np0005538960 NetworkManager[55548]: <info>  [1764346721.7090] device (tap6b498512-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:18:41 np0005538960 systemd-machined[153518]: New machine qemu-1-instance-00000003.
Nov 28 11:18:41 np0005538960 ovn_controller[95460]: 2025-11-28T16:18:41Z|00029|binding|INFO|Setting lport 6b498512-32dd-4e59-95bd-71c3a69bb44f ovn-installed in OVS
Nov 28 11:18:41 np0005538960 ovn_controller[95460]: 2025-11-28T16:18:41Z|00030|binding|INFO|Setting lport 6b498512-32dd-4e59-95bd-71c3a69bb44f up in Southbound
Nov 28 11:18:41 np0005538960 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Nov 28 11:18:41 np0005538960 nova_compute[187252]: 2025-11-28 16:18:41.786 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:42 np0005538960 nova_compute[187252]: 2025-11-28 16:18:42.293 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:42 np0005538960 nova_compute[187252]: 2025-11-28 16:18:42.394 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346722.394232, eff28834-4c5b-46d0-90a8-4be63b9fff80 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:18:42 np0005538960 nova_compute[187252]: 2025-11-28 16:18:42.396 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] VM Started (Lifecycle Event)#033[00m
Nov 28 11:18:42 np0005538960 nova_compute[187252]: 2025-11-28 16:18:42.412 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:18:42 np0005538960 nova_compute[187252]: 2025-11-28 16:18:42.417 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346722.3956385, eff28834-4c5b-46d0-90a8-4be63b9fff80 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:18:42 np0005538960 nova_compute[187252]: 2025-11-28 16:18:42.417 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:18:42 np0005538960 nova_compute[187252]: 2025-11-28 16:18:42.434 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:18:42 np0005538960 nova_compute[187252]: 2025-11-28 16:18:42.439 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:18:42 np0005538960 nova_compute[187252]: 2025-11-28 16:18:42.455 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:18:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:42.627 104369 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 28 11:18:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:42.628 104369 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpsc6ko7ke/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 28 11:18:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:42.419 214244 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 28 11:18:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:42.425 214244 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 28 11:18:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:42.427 214244 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Nov 28 11:18:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:42.427 214244 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214244#033[00m
Nov 28 11:18:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:42.631 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[006b0fe2-48ed-4386-87a6-88c424a84196]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:43 np0005538960 podman[214249]: 2025-11-28 16:18:43.21178361 +0000 UTC m=+0.102674526 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:18:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:43.225 214244 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:18:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:43.225 214244 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:18:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:43.226 214244 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:18:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:43.862 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[924a05e1-c28b-4ce2-8e51-fa2ad7f8fe92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:43.863 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape779c78f-41 in ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:18:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:43.864 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape779c78f-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:18:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:43.864 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[83f15444-4445-4e3b-b0de-b2bd9aac90ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:43.867 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e590df07-fbea-40ec-9d9c-f3ed3fc6ee50]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:43.917 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[a734d20d-0365-48fc-8f1d-6d5a6d1910d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:44.080 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca72b6a-839f-4953-8aa8-d535f3207d0a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:44.082 104369 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp2xz_v2wf/privsep.sock']#033[00m
Nov 28 11:18:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:44.866 104369 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Nov 28 11:18:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:44.867 104369 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp2xz_v2wf/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Nov 28 11:18:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:44.723 214278 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Nov 28 11:18:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:44.727 214278 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Nov 28 11:18:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:44.730 214278 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Nov 28 11:18:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:44.730 214278 INFO oslo.privsep.daemon [-] privsep daemon running as pid 214278#033[00m
Nov 28 11:18:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:44.870 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[19bd607b-1ff6-4664-935d-a5ea828269f3]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:45 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:45.366 214278 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:18:45 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:45.367 214278 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:18:45 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:45.367 214278 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:18:45 np0005538960 nova_compute[187252]: 2025-11-28 16:18:45.719 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.025 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[5b7df716-c6c4-4b3d-9b90-cea66c5da7e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.048 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b481b1-c7ed-4ce6-8cfd-8025e666ab53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 NetworkManager[55548]: <info>  [1764346726.0492] manager: (tape779c78f-40): new Veth device (/org/freedesktop/NetworkManager/Devices/23)
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.090 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[32d6b9b5-a1bd-4c07-b193-7aa0a76666ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.094 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[d85418cd-44d9-4324-8533-bcf6163ad961]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 systemd-udevd[214295]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:18:46 np0005538960 NetworkManager[55548]: <info>  [1764346726.1191] device (tape779c78f-40): carrier: link connected
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.126 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[ae254f87-5eea-4a54-a64d-610d1b7ef8ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.160 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2e94f3cd-4edb-4369-b47a-fcaf93a83c28]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape779c78f-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:7c:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382569, 'reachable_time': 43711, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214321, 'error': None, 'target': 'ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 podman[214287]: 2025-11-28 16:18:46.167989904 +0000 UTC m=+0.083185253 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.178 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[58ca7787-0df5-4382-b33b-90d46c92af17]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe12:7c45'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382569, 'tstamp': 382569}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214331, 'error': None, 'target': 'ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.199 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[36ad923e-0aaa-4209-ae53-efa8bc1f643d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape779c78f-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:12:7c:45'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 12], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382569, 'reachable_time': 43711, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214332, 'error': None, 'target': 'ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.254 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[fd81e3f3-69d1-4a44-8606-953f677ee695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.305 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2a4f7c57-9ae6-4ee8-947a-75e30d0a5c83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.307 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape779c78f-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.308 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.308 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape779c78f-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:18:46 np0005538960 nova_compute[187252]: 2025-11-28 16:18:46.310 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:46 np0005538960 kernel: tape779c78f-40: entered promiscuous mode
Nov 28 11:18:46 np0005538960 NetworkManager[55548]: <info>  [1764346726.3121] manager: (tape779c78f-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Nov 28 11:18:46 np0005538960 nova_compute[187252]: 2025-11-28 16:18:46.313 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.314 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape779c78f-40, col_values=(('external_ids', {'iface-id': '8da911b6-4b06-444b-b895-eebe136e2189'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:18:46 np0005538960 nova_compute[187252]: 2025-11-28 16:18:46.315 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:18:46Z|00031|binding|INFO|Releasing lport 8da911b6-4b06-444b-b895-eebe136e2189 from this chassis (sb_readonly=0)
Nov 28 11:18:46 np0005538960 nova_compute[187252]: 2025-11-28 16:18:46.317 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.318 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e779c78f-4948-46c1-a91a-4b1068ceaae1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e779c78f-4948-46c1-a91a-4b1068ceaae1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.319 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1ae7c37f-475d-4bc0-974f-6058ecb62779]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.321 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-e779c78f-4948-46c1-a91a-4b1068ceaae1
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/e779c78f-4948-46c1-a91a-4b1068ceaae1.pid.haproxy
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID e779c78f-4948-46c1-a91a-4b1068ceaae1
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:18:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:46.322 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1', 'env', 'PROCESS_TAG=haproxy-e779c78f-4948-46c1-a91a-4b1068ceaae1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e779c78f-4948-46c1-a91a-4b1068ceaae1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:18:46 np0005538960 nova_compute[187252]: 2025-11-28 16:18:46.328 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:46 np0005538960 podman[214363]: 2025-11-28 16:18:46.828382825 +0000 UTC m=+0.105153795 container create 383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:18:46 np0005538960 podman[214363]: 2025-11-28 16:18:46.783114849 +0000 UTC m=+0.059885809 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:18:46 np0005538960 systemd[1]: Started libpod-conmon-383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9.scope.
Nov 28 11:18:46 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:18:46 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1af84f5093ee7d3f1aa174f428b166794890ba043550a3c1d03954af8115883b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:18:46 np0005538960 podman[214363]: 2025-11-28 16:18:46.938230403 +0000 UTC m=+0.215001413 container init 383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 11:18:46 np0005538960 podman[214363]: 2025-11-28 16:18:46.99056083 +0000 UTC m=+0.267331810 container start 383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 11:18:47 np0005538960 neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1[214379]: [NOTICE]   (214383) : New worker (214385) forked
Nov 28 11:18:47 np0005538960 neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1[214379]: [NOTICE]   (214383) : Loading success.
Nov 28 11:18:47 np0005538960 nova_compute[187252]: 2025-11-28 16:18:47.338 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:48 np0005538960 nova_compute[187252]: 2025-11-28 16:18:48.516 187256 DEBUG nova.network.neutron [req-d924870c-386f-4a79-8c1d-66077a4e66c4 req-86658a41-75c0-4aad-b5f5-4a5f20cd016f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updated VIF entry in instance network info cache for port 6b498512-32dd-4e59-95bd-71c3a69bb44f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:18:48 np0005538960 nova_compute[187252]: 2025-11-28 16:18:48.517 187256 DEBUG nova.network.neutron [req-d924870c-386f-4a79-8c1d-66077a4e66c4 req-86658a41-75c0-4aad-b5f5-4a5f20cd016f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updating instance_info_cache with network_info: [{"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:18:48 np0005538960 nova_compute[187252]: 2025-11-28 16:18:48.537 187256 DEBUG oslo_concurrency.lockutils [req-d924870c-386f-4a79-8c1d-66077a4e66c4 req-86658a41-75c0-4aad-b5f5-4a5f20cd016f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:18:50 np0005538960 nova_compute[187252]: 2025-11-28 16:18:50.730 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.273 187256 DEBUG nova.compute.manager [req-291ba62e-ccc9-4734-a44e-f7bda0e3eac4 req-a88cb6b5-0d00-460e-8e3a-73c75414631a 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received event network-vif-plugged-6b498512-32dd-4e59-95bd-71c3a69bb44f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.274 187256 DEBUG oslo_concurrency.lockutils [req-291ba62e-ccc9-4734-a44e-f7bda0e3eac4 req-a88cb6b5-0d00-460e-8e3a-73c75414631a 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.274 187256 DEBUG oslo_concurrency.lockutils [req-291ba62e-ccc9-4734-a44e-f7bda0e3eac4 req-a88cb6b5-0d00-460e-8e3a-73c75414631a 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.274 187256 DEBUG oslo_concurrency.lockutils [req-291ba62e-ccc9-4734-a44e-f7bda0e3eac4 req-a88cb6b5-0d00-460e-8e3a-73c75414631a 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.274 187256 DEBUG nova.compute.manager [req-291ba62e-ccc9-4734-a44e-f7bda0e3eac4 req-a88cb6b5-0d00-460e-8e3a-73c75414631a 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Processing event network-vif-plugged-6b498512-32dd-4e59-95bd-71c3a69bb44f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.275 187256 DEBUG nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.279 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346731.2790549, eff28834-4c5b-46d0-90a8-4be63b9fff80 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.280 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.281 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.291 187256 INFO nova.virt.libvirt.driver [-] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Instance spawned successfully.#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.292 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.304 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.310 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.312 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.313 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.313 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.313 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.314 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.314 187256 DEBUG nova.virt.libvirt.driver [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:18:51 np0005538960 nova_compute[187252]: 2025-11-28 16:18:51.331 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:18:51 np0005538960 podman[214394]: 2025-11-28 16:18:51.341500974 +0000 UTC m=+0.198033882 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller)
Nov 28 11:18:52 np0005538960 nova_compute[187252]: 2025-11-28 16:18:52.094 187256 INFO nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Took 24.44 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:18:52 np0005538960 nova_compute[187252]: 2025-11-28 16:18:52.095 187256 DEBUG nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:18:52 np0005538960 nova_compute[187252]: 2025-11-28 16:18:52.341 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:52 np0005538960 nova_compute[187252]: 2025-11-28 16:18:52.546 187256 INFO nova.compute.manager [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Took 25.49 seconds to build instance.#033[00m
Nov 28 11:18:52 np0005538960 nova_compute[187252]: 2025-11-28 16:18:52.733 187256 DEBUG oslo_concurrency.lockutils [None req-0c46cc14-1f76-441b-8111-bf3da5c84632 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 25.763s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:18:55 np0005538960 podman[214423]: 2025-11-28 16:18:55.175198112 +0000 UTC m=+0.066959581 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:18:55 np0005538960 podman[214422]: 2025-11-28 16:18:55.186889206 +0000 UTC m=+0.081267088 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 11:18:55 np0005538960 nova_compute[187252]: 2025-11-28 16:18:55.216 187256 DEBUG nova.compute.manager [req-cb30799f-4ba6-4275-b845-6c69eea3630d req-254aa293-ea04-4fc6-9909-263dd52b1fe3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received event network-vif-plugged-6b498512-32dd-4e59-95bd-71c3a69bb44f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:18:55 np0005538960 nova_compute[187252]: 2025-11-28 16:18:55.216 187256 DEBUG oslo_concurrency.lockutils [req-cb30799f-4ba6-4275-b845-6c69eea3630d req-254aa293-ea04-4fc6-9909-263dd52b1fe3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:18:55 np0005538960 nova_compute[187252]: 2025-11-28 16:18:55.216 187256 DEBUG oslo_concurrency.lockutils [req-cb30799f-4ba6-4275-b845-6c69eea3630d req-254aa293-ea04-4fc6-9909-263dd52b1fe3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:18:55 np0005538960 nova_compute[187252]: 2025-11-28 16:18:55.217 187256 DEBUG oslo_concurrency.lockutils [req-cb30799f-4ba6-4275-b845-6c69eea3630d req-254aa293-ea04-4fc6-9909-263dd52b1fe3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:18:55 np0005538960 nova_compute[187252]: 2025-11-28 16:18:55.217 187256 DEBUG nova.compute.manager [req-cb30799f-4ba6-4275-b845-6c69eea3630d req-254aa293-ea04-4fc6-9909-263dd52b1fe3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] No waiting events found dispatching network-vif-plugged-6b498512-32dd-4e59-95bd-71c3a69bb44f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:18:55 np0005538960 nova_compute[187252]: 2025-11-28 16:18:55.217 187256 WARNING nova.compute.manager [req-cb30799f-4ba6-4275-b845-6c69eea3630d req-254aa293-ea04-4fc6-9909-263dd52b1fe3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received unexpected event network-vif-plugged-6b498512-32dd-4e59-95bd-71c3a69bb44f for instance with vm_state active and task_state None.#033[00m
Nov 28 11:18:55 np0005538960 nova_compute[187252]: 2025-11-28 16:18:55.590 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:55.590 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:18:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:55.592 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:18:55 np0005538960 nova_compute[187252]: 2025-11-28 16:18:55.732 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:57 np0005538960 nova_compute[187252]: 2025-11-28 16:18:57.343 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:18:58 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:18:58.593 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:00 np0005538960 nova_compute[187252]: 2025-11-28 16:19:00.736 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.121 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquiring lock "355865cd-5e10-47f6-9353-89df93d33afb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.122 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.149 187256 DEBUG nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:19:01 np0005538960 podman[214457]: 2025-11-28 16:19:01.17553701 +0000 UTC m=+0.079323621 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.296 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.297 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.308 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.309 187256 INFO nova.compute.claims [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.519 187256 DEBUG nova.compute.provider_tree [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.579 187256 ERROR nova.scheduler.client.report [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [req-dc105b2e-de69-45a5-9a9c-c89a27f42e42] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 65f0ce30-d9ca-4c16-b536-acd92f5f41ce.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-dc105b2e-de69-45a5-9a9c-c89a27f42e42"}]}#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.608 187256 DEBUG nova.scheduler.client.report [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Refreshing inventories for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.627 187256 DEBUG nova.scheduler.client.report [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Updating ProviderTree inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.627 187256 DEBUG nova.compute.provider_tree [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.650 187256 DEBUG nova.scheduler.client.report [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Refreshing aggregate associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.675 187256 DEBUG nova.scheduler.client.report [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Refreshing trait associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, traits: COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.784 187256 DEBUG nova.compute.provider_tree [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.869 187256 DEBUG nova.scheduler.client.report [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Updated inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.870 187256 DEBUG nova.compute.provider_tree [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Updating resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.870 187256 DEBUG nova.compute.provider_tree [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.900 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.901 187256 DEBUG nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.967 187256 DEBUG nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:19:01 np0005538960 nova_compute[187252]: 2025-11-28 16:19:01.968 187256 DEBUG nova.network.neutron [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.004 187256 INFO nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.055 187256 DEBUG nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.261 187256 DEBUG nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.263 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.263 187256 INFO nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Creating image(s)#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.264 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquiring lock "/var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.264 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "/var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.265 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "/var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.281 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.345 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.358 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.359 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.360 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.373 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.440 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.441 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.482 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.483 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.484 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.547 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.548 187256 DEBUG nova.virt.disk.api [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Checking if we can resize image /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.549 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.605 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.608 187256 DEBUG nova.virt.disk.api [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Cannot resize image /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.608 187256 DEBUG nova.objects.instance [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lazy-loading 'migration_context' on Instance uuid 355865cd-5e10-47f6-9353-89df93d33afb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.630 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.631 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Ensure instance console log exists: /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.634 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.635 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:02 np0005538960 nova_compute[187252]: 2025-11-28 16:19:02.635 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:04 np0005538960 nova_compute[187252]: 2025-11-28 16:19:04.533 187256 DEBUG nova.network.neutron [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Successfully created port: da59e250-e8c3-4aa3-a762-8c4ce1941548 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:19:05 np0005538960 podman[214506]: 2025-11-28 16:19:05.211338148 +0000 UTC m=+0.100939573 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, distribution-scope=public, release=1755695350)
Nov 28 11:19:05 np0005538960 nova_compute[187252]: 2025-11-28 16:19:05.613 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:05 np0005538960 NetworkManager[55548]: <info>  [1764346745.6136] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/25)
Nov 28 11:19:05 np0005538960 NetworkManager[55548]: <info>  [1764346745.6143] device (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 11:19:05 np0005538960 NetworkManager[55548]: <info>  [1764346745.6152] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/26)
Nov 28 11:19:05 np0005538960 NetworkManager[55548]: <info>  [1764346745.6155] device (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Nov 28 11:19:05 np0005538960 NetworkManager[55548]: <info>  [1764346745.6162] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Nov 28 11:19:05 np0005538960 NetworkManager[55548]: <info>  [1764346745.6167] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Nov 28 11:19:05 np0005538960 NetworkManager[55548]: <info>  [1764346745.6171] device (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 28 11:19:05 np0005538960 NetworkManager[55548]: <info>  [1764346745.6174] device (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Nov 28 11:19:05 np0005538960 nova_compute[187252]: 2025-11-28 16:19:05.738 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:05 np0005538960 nova_compute[187252]: 2025-11-28 16:19:05.861 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:05 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:05Z|00032|binding|INFO|Releasing lport 8da911b6-4b06-444b-b895-eebe136e2189 from this chassis (sb_readonly=0)
Nov 28 11:19:05 np0005538960 nova_compute[187252]: 2025-11-28 16:19:05.900 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:06Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f3:00:db 10.100.0.14
Nov 28 11:19:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:06Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f3:00:db 10.100.0.14
Nov 28 11:19:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:06.338 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:06.339 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:06.339 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:07 np0005538960 nova_compute[187252]: 2025-11-28 16:19:07.347 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:10 np0005538960 nova_compute[187252]: 2025-11-28 16:19:10.743 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:11 np0005538960 nova_compute[187252]: 2025-11-28 16:19:11.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:19:12 np0005538960 nova_compute[187252]: 2025-11-28 16:19:12.352 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.194 187256 DEBUG nova.network.neutron [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Successfully updated port: da59e250-e8c3-4aa3-a762-8c4ce1941548 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.215 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquiring lock "refresh_cache-355865cd-5e10-47f6-9353-89df93d33afb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.215 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquired lock "refresh_cache-355865cd-5e10-47f6-9353-89df93d33afb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.216 187256 DEBUG nova.network.neutron [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:19:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:13Z|00033|memory|INFO|peak resident set size grew 56% in last 1065.0 seconds, from 16000 kB to 24960 kB
Nov 28 11:19:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:13Z|00034|memory|INFO|idl-cells-OVN_Southbound:11350 idl-cells-Open_vSwitch:813 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:372 lflow-cache-entries-cache-matches:293 lflow-cache-size-KB:1532 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:689 ofctrl_installed_flow_usage-KB:505 ofctrl_sb_flow_ref_usage-KB:255
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.310 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.340 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.366 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.368 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.368 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.368 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.503 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:13 np0005538960 podman[214532]: 2025-11-28 16:19:13.518883986 +0000 UTC m=+0.098576647 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.564 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.565 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.614 187256 DEBUG nova.network.neutron [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.623 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.663 187256 INFO nova.compute.manager [None req-c45ea313-f316-4fa4-b338-281c2f128ac0 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Get console output#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.766 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.804 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.805 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5584MB free_disk=73.31568145751953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.806 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.806 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.919 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance eff28834-4c5b-46d0-90a8-4be63b9fff80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.919 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance 355865cd-5e10-47f6-9353-89df93d33afb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.919 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:19:13 np0005538960 nova_compute[187252]: 2025-11-28 16:19:13.920 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:19:14 np0005538960 nova_compute[187252]: 2025-11-28 16:19:14.057 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:19:14 np0005538960 nova_compute[187252]: 2025-11-28 16:19:14.080 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:19:14 np0005538960 nova_compute[187252]: 2025-11-28 16:19:14.109 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:19:14 np0005538960 nova_compute[187252]: 2025-11-28 16:19:14.109 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.084 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.085 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.086 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.086 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.297 187256 DEBUG nova.network.neutron [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Updating instance_info_cache with network_info: [{"id": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "address": "fa:16:3e:a8:74:58", "network": {"id": "c85ae182-6cfd-451f-a7ec-421bd1586236", "bridge": "br-int", "label": "tempest-TestServerMultinode-781725514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff8db40a0354f34ad116d29c11a0b49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda59e250-e8", "ovs_interfaceid": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.317 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.321 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Releasing lock "refresh_cache-355865cd-5e10-47f6-9353-89df93d33afb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.321 187256 DEBUG nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Instance network_info: |[{"id": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "address": "fa:16:3e:a8:74:58", "network": {"id": "c85ae182-6cfd-451f-a7ec-421bd1586236", "bridge": "br-int", "label": "tempest-TestServerMultinode-781725514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff8db40a0354f34ad116d29c11a0b49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda59e250-e8", "ovs_interfaceid": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.326 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Start _get_guest_xml network_info=[{"id": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "address": "fa:16:3e:a8:74:58", "network": {"id": "c85ae182-6cfd-451f-a7ec-421bd1586236", "bridge": "br-int", "label": "tempest-TestServerMultinode-781725514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff8db40a0354f34ad116d29c11a0b49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda59e250-e8", "ovs_interfaceid": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.337 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.339 187256 WARNING nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.344 187256 DEBUG nova.virt.libvirt.host [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.345 187256 DEBUG nova.virt.libvirt.host [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.353 187256 DEBUG nova.virt.libvirt.host [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.354 187256 DEBUG nova.virt.libvirt.host [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.356 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.356 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.356 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.356 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.357 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.357 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.357 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.357 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.357 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.357 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.358 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.358 187256 DEBUG nova.virt.hardware [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.361 187256 DEBUG nova.virt.libvirt.vif [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:18:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1425157105',display_name='tempest-TestServerMultinode-server-1425157105',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1425157105',id=6,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='752d0a937cab4d35aabf297fb7442543',ramdisk_id='',reservation_id='r-qms8tkvf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-718990049',owner_user_name='tempest-TestServerMultinode-718990049-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:19:02Z,user_data=None,user_id='9552307b9b4a4494a17ee2eddce9f8ac',uuid=355865cd-5e10-47f6-9353-89df93d33afb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "address": "fa:16:3e:a8:74:58", "network": {"id": "c85ae182-6cfd-451f-a7ec-421bd1586236", "bridge": "br-int", "label": "tempest-TestServerMultinode-781725514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff8db40a0354f34ad116d29c11a0b49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda59e250-e8", "ovs_interfaceid": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.362 187256 DEBUG nova.network.os_vif_util [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Converting VIF {"id": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "address": "fa:16:3e:a8:74:58", "network": {"id": "c85ae182-6cfd-451f-a7ec-421bd1586236", "bridge": "br-int", "label": "tempest-TestServerMultinode-781725514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff8db40a0354f34ad116d29c11a0b49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda59e250-e8", "ovs_interfaceid": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.362 187256 DEBUG nova.network.os_vif_util [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:74:58,bridge_name='br-int',has_traffic_filtering=True,id=da59e250-e8c3-4aa3-a762-8c4ce1941548,network=Network(c85ae182-6cfd-451f-a7ec-421bd1586236),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda59e250-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.363 187256 DEBUG nova.objects.instance [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lazy-loading 'pci_devices' on Instance uuid 355865cd-5e10-47f6-9353-89df93d33afb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.385 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <uuid>355865cd-5e10-47f6-9353-89df93d33afb</uuid>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <name>instance-00000006</name>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestServerMultinode-server-1425157105</nova:name>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:19:15</nova:creationTime>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:        <nova:user uuid="9552307b9b4a4494a17ee2eddce9f8ac">tempest-TestServerMultinode-718990049-project-admin</nova:user>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:        <nova:project uuid="752d0a937cab4d35aabf297fb7442543">tempest-TestServerMultinode-718990049</nova:project>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:        <nova:port uuid="da59e250-e8c3-4aa3-a762-8c4ce1941548">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <entry name="serial">355865cd-5e10-47f6-9353-89df93d33afb</entry>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <entry name="uuid">355865cd-5e10-47f6-9353-89df93d33afb</entry>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk.config"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:a8:74:58"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <target dev="tapda59e250-e8"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/console.log" append="off"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:19:15 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:19:15 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:19:15 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:19:15 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.385 187256 DEBUG nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Preparing to wait for external event network-vif-plugged-da59e250-e8c3-4aa3-a762-8c4ce1941548 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.385 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquiring lock "355865cd-5e10-47f6-9353-89df93d33afb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.386 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.386 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.386 187256 DEBUG nova.virt.libvirt.vif [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:18:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1425157105',display_name='tempest-TestServerMultinode-server-1425157105',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1425157105',id=6,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='752d0a937cab4d35aabf297fb7442543',ramdisk_id='',reservation_id='r-qms8tkvf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-718990049',owner_user_name='tempest-TestServerMultinode-718990049-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:19:02Z,user_data=None,user_id='9552307b9b4a4494a17ee2eddce9f8ac',uuid=355865cd-5e10-47f6-9353-89df93d33afb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "address": "fa:16:3e:a8:74:58", "network": {"id": "c85ae182-6cfd-451f-a7ec-421bd1586236", "bridge": "br-int", "label": "tempest-TestServerMultinode-781725514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff8db40a0354f34ad116d29c11a0b49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda59e250-e8", "ovs_interfaceid": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.387 187256 DEBUG nova.network.os_vif_util [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Converting VIF {"id": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "address": "fa:16:3e:a8:74:58", "network": {"id": "c85ae182-6cfd-451f-a7ec-421bd1586236", "bridge": "br-int", "label": "tempest-TestServerMultinode-781725514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff8db40a0354f34ad116d29c11a0b49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda59e250-e8", "ovs_interfaceid": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.387 187256 DEBUG nova.network.os_vif_util [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:74:58,bridge_name='br-int',has_traffic_filtering=True,id=da59e250-e8c3-4aa3-a762-8c4ce1941548,network=Network(c85ae182-6cfd-451f-a7ec-421bd1586236),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda59e250-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.387 187256 DEBUG os_vif [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:74:58,bridge_name='br-int',has_traffic_filtering=True,id=da59e250-e8c3-4aa3-a762-8c4ce1941548,network=Network(c85ae182-6cfd-451f-a7ec-421bd1586236),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda59e250-e8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.388 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.388 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.389 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.392 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.393 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda59e250-e8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.393 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapda59e250-e8, col_values=(('external_ids', {'iface-id': 'da59e250-e8c3-4aa3-a762-8c4ce1941548', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:74:58', 'vm-uuid': '355865cd-5e10-47f6-9353-89df93d33afb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.395 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:15 np0005538960 NetworkManager[55548]: <info>  [1764346755.3968] manager: (tapda59e250-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.397 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.403 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.404 187256 INFO os_vif [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:74:58,bridge_name='br-int',has_traffic_filtering=True,id=da59e250-e8c3-4aa3-a762-8c4ce1941548,network=Network(c85ae182-6cfd-451f-a7ec-421bd1586236),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda59e250-e8')#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.468 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.468 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.469 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] No VIF found with MAC fa:16:3e:a8:74:58, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.469 187256 INFO nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Using config drive#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.630 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.630 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.631 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.631 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid eff28834-4c5b-46d0-90a8-4be63b9fff80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.986 187256 INFO nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Creating config drive at /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk.config#033[00m
Nov 28 11:19:15 np0005538960 nova_compute[187252]: 2025-11-28 16:19:15.992 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5_bmv562 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.118 187256 DEBUG oslo_concurrency.processutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5_bmv562" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:16 np0005538960 kernel: tapda59e250-e8: entered promiscuous mode
Nov 28 11:19:16 np0005538960 NetworkManager[55548]: <info>  [1764346756.1828] manager: (tapda59e250-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/30)
Nov 28 11:19:16 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:16Z|00035|binding|INFO|Claiming lport da59e250-e8c3-4aa3-a762-8c4ce1941548 for this chassis.
Nov 28 11:19:16 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:16Z|00036|binding|INFO|da59e250-e8c3-4aa3-a762-8c4ce1941548: Claiming fa:16:3e:a8:74:58 10.100.0.8
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.184 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:16 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:16Z|00037|binding|INFO|Setting lport da59e250-e8c3-4aa3-a762-8c4ce1941548 ovn-installed in OVS
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.200 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.203 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:16 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:16Z|00038|binding|INFO|Setting lport da59e250-e8c3-4aa3-a762-8c4ce1941548 up in Southbound
Nov 28 11:19:16 np0005538960 systemd-udevd[214581]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.227 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:74:58 10.100.0.8'], port_security=['fa:16:3e:a8:74:58 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '355865cd-5e10-47f6-9353-89df93d33afb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c85ae182-6cfd-451f-a7ec-421bd1586236', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '752d0a937cab4d35aabf297fb7442543', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0fff1aaf-06b5-462b-b0fb-38684dbb1340', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ed2682-909b-49cd-b50f-f638e52c5e89, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=da59e250-e8c3-4aa3-a762-8c4ce1941548) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.228 104369 INFO neutron.agent.ovn.metadata.agent [-] Port da59e250-e8c3-4aa3-a762-8c4ce1941548 in datapath c85ae182-6cfd-451f-a7ec-421bd1586236 bound to our chassis#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.231 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c85ae182-6cfd-451f-a7ec-421bd1586236#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.244 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[7dff8cf5-f419-4ba5-895a-128b25ca40ec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.245 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc85ae182-61 in ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.248 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc85ae182-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.249 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[44eb04ff-1ce3-4e51-9682-4f2898aa03d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 NetworkManager[55548]: <info>  [1764346756.2504] device (tapda59e250-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:19:16 np0005538960 NetworkManager[55548]: <info>  [1764346756.2512] device (tapda59e250-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.251 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5652c29b-f98b-4307-8b15-156dd58765cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 systemd-machined[153518]: New machine qemu-2-instance-00000006.
Nov 28 11:19:16 np0005538960 systemd[1]: Started Virtual Machine qemu-2-instance-00000006.
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.274 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[fd26bee0-a6b2-4473-8884-b946077c2354]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.291 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[beec5e59-0300-4ae1-92e3-51b8161c5d41]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 podman[214575]: 2025-11-28 16:19:16.295947196 +0000 UTC m=+0.074325910 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.321 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[8596a1a8-a6a6-47e3-b437-3e3fbc87e692]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.327 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[04278751-e216-4233-af4f-5923b9e9112f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 NetworkManager[55548]: <info>  [1764346756.3289] manager: (tapc85ae182-60): new Veth device (/org/freedesktop/NetworkManager/Devices/31)
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.365 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[b3821fec-db02-47a6-b8ca-d7b0482e8cde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.368 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[2694d28a-eaaf-421b-ba85-65dec7d33efb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 NetworkManager[55548]: <info>  [1764346756.3952] device (tapc85ae182-60): carrier: link connected
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.402 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[a47520e3-ac6f-4933-bb62-d7ecdbc89848]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.425 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[586d8b1c-046e-4010-8896-b01d00f56949]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc85ae182-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ac:ba:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385597, 'reachable_time': 35561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214638, 'error': None, 'target': 'ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.445 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f43cf3dc-c164-489b-9d83-4afecf8e9418]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feac:babd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 385597, 'tstamp': 385597}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214639, 'error': None, 'target': 'ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.474 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[eb1e6d78-b1f8-4448-b080-2dd6778440e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc85ae182-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ac:ba:bd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385597, 'reachable_time': 35561, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214640, 'error': None, 'target': 'ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.508 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[eceeee8d-59d5-468f-af2d-56a60064c89c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.574 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[0f83d8be-b867-47bb-9bde-b325d68cf50b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.576 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc85ae182-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.576 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.577 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc85ae182-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:16 np0005538960 kernel: tapc85ae182-60: entered promiscuous mode
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.579 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:16 np0005538960 NetworkManager[55548]: <info>  [1764346756.5805] manager: (tapc85ae182-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.581 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc85ae182-60, col_values=(('external_ids', {'iface-id': '28ee45ac-723d-4a3b-8a32-5cc4f8a0cac0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.582 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:16 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:16Z|00039|binding|INFO|Releasing lport 28ee45ac-723d-4a3b-8a32-5cc4f8a0cac0 from this chassis (sb_readonly=0)
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.594 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.596 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c85ae182-6cfd-451f-a7ec-421bd1586236.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c85ae182-6cfd-451f-a7ec-421bd1586236.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.598 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[559dea44-46bf-404f-9aa8-a583b0f9bdd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.599 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-c85ae182-6cfd-451f-a7ec-421bd1586236
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/c85ae182-6cfd-451f-a7ec-421bd1586236.pid.haproxy
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID c85ae182-6cfd-451f-a7ec-421bd1586236
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:19:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:16.600 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236', 'env', 'PROCESS_TAG=haproxy-c85ae182-6cfd-451f-a7ec-421bd1586236', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c85ae182-6cfd-451f-a7ec-421bd1586236.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.855 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346756.8546968, 355865cd-5e10-47f6-9353-89df93d33afb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.856 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] VM Started (Lifecycle Event)#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.875 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.880 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346756.8592744, 355865cd-5e10-47f6-9353-89df93d33afb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.880 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.895 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.899 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:19:16 np0005538960 nova_compute[187252]: 2025-11-28 16:19:16.914 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:19:17 np0005538960 podman[214679]: 2025-11-28 16:19:17.025973281 +0000 UTC m=+0.081977635 container create d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:19:17 np0005538960 systemd[1]: Started libpod-conmon-d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b.scope.
Nov 28 11:19:17 np0005538960 podman[214679]: 2025-11-28 16:19:16.990555444 +0000 UTC m=+0.046559798 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:19:17 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:19:17 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1169710d0081143b0887cb9d8614102324a37a04cfcf45a77d2283f2ad438a8b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:19:17 np0005538960 podman[214679]: 2025-11-28 16:19:17.123687925 +0000 UTC m=+0.179692259 container init d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:19:17 np0005538960 podman[214679]: 2025-11-28 16:19:17.129753052 +0000 UTC m=+0.185757366 container start d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 11:19:17 np0005538960 neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236[214694]: [NOTICE]   (214698) : New worker (214700) forked
Nov 28 11:19:17 np0005538960 neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236[214694]: [NOTICE]   (214698) : Loading success.
Nov 28 11:19:17 np0005538960 nova_compute[187252]: 2025-11-28 16:19:17.355 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:17 np0005538960 nova_compute[187252]: 2025-11-28 16:19:17.590 187256 DEBUG nova.compute.manager [req-4e12ed47-3c33-4553-aa1d-776fe62495ae req-ae27f571-f62a-45a4-af7a-e9ae9da6f55c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Received event network-changed-da59e250-e8c3-4aa3-a762-8c4ce1941548 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:19:17 np0005538960 nova_compute[187252]: 2025-11-28 16:19:17.590 187256 DEBUG nova.compute.manager [req-4e12ed47-3c33-4553-aa1d-776fe62495ae req-ae27f571-f62a-45a4-af7a-e9ae9da6f55c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Refreshing instance network info cache due to event network-changed-da59e250-e8c3-4aa3-a762-8c4ce1941548. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:19:17 np0005538960 nova_compute[187252]: 2025-11-28 16:19:17.591 187256 DEBUG oslo_concurrency.lockutils [req-4e12ed47-3c33-4553-aa1d-776fe62495ae req-ae27f571-f62a-45a4-af7a-e9ae9da6f55c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-355865cd-5e10-47f6-9353-89df93d33afb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:19:17 np0005538960 nova_compute[187252]: 2025-11-28 16:19:17.591 187256 DEBUG oslo_concurrency.lockutils [req-4e12ed47-3c33-4553-aa1d-776fe62495ae req-ae27f571-f62a-45a4-af7a-e9ae9da6f55c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-355865cd-5e10-47f6-9353-89df93d33afb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:19:17 np0005538960 nova_compute[187252]: 2025-11-28 16:19:17.591 187256 DEBUG nova.network.neutron [req-4e12ed47-3c33-4553-aa1d-776fe62495ae req-ae27f571-f62a-45a4-af7a-e9ae9da6f55c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Refreshing network info cache for port da59e250-e8c3-4aa3-a762-8c4ce1941548 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:19:17 np0005538960 nova_compute[187252]: 2025-11-28 16:19:17.652 187256 DEBUG nova.compute.manager [req-0ad2e51f-32db-45c0-a4a0-fbd141812c77 req-03d48811-12c3-4744-a501-0c787a0a7727 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received event network-changed-6b498512-32dd-4e59-95bd-71c3a69bb44f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:19:17 np0005538960 nova_compute[187252]: 2025-11-28 16:19:17.652 187256 DEBUG nova.compute.manager [req-0ad2e51f-32db-45c0-a4a0-fbd141812c77 req-03d48811-12c3-4744-a501-0c787a0a7727 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Refreshing instance network info cache due to event network-changed-6b498512-32dd-4e59-95bd-71c3a69bb44f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:19:17 np0005538960 nova_compute[187252]: 2025-11-28 16:19:17.653 187256 DEBUG oslo_concurrency.lockutils [req-0ad2e51f-32db-45c0-a4a0-fbd141812c77 req-03d48811-12c3-4744-a501-0c787a0a7727 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:19:19 np0005538960 nova_compute[187252]: 2025-11-28 16:19:19.071 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updating instance_info_cache with network_info: [{"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:19:19 np0005538960 nova_compute[187252]: 2025-11-28 16:19:19.113 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:19:19 np0005538960 nova_compute[187252]: 2025-11-28 16:19:19.114 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 11:19:19 np0005538960 nova_compute[187252]: 2025-11-28 16:19:19.114 187256 DEBUG oslo_concurrency.lockutils [req-0ad2e51f-32db-45c0-a4a0-fbd141812c77 req-03d48811-12c3-4744-a501-0c787a0a7727 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:19:19 np0005538960 nova_compute[187252]: 2025-11-28 16:19:19.114 187256 DEBUG nova.network.neutron [req-0ad2e51f-32db-45c0-a4a0-fbd141812c77 req-03d48811-12c3-4744-a501-0c787a0a7727 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Refreshing network info cache for port 6b498512-32dd-4e59-95bd-71c3a69bb44f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:19:19 np0005538960 nova_compute[187252]: 2025-11-28 16:19:19.116 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:19:19 np0005538960 nova_compute[187252]: 2025-11-28 16:19:19.116 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.293 187256 DEBUG nova.network.neutron [req-4e12ed47-3c33-4553-aa1d-776fe62495ae req-ae27f571-f62a-45a4-af7a-e9ae9da6f55c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Updated VIF entry in instance network info cache for port da59e250-e8c3-4aa3-a762-8c4ce1941548. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.294 187256 DEBUG nova.network.neutron [req-4e12ed47-3c33-4553-aa1d-776fe62495ae req-ae27f571-f62a-45a4-af7a-e9ae9da6f55c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Updating instance_info_cache with network_info: [{"id": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "address": "fa:16:3e:a8:74:58", "network": {"id": "c85ae182-6cfd-451f-a7ec-421bd1586236", "bridge": "br-int", "label": "tempest-TestServerMultinode-781725514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff8db40a0354f34ad116d29c11a0b49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda59e250-e8", "ovs_interfaceid": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.317 187256 DEBUG oslo_concurrency.lockutils [req-4e12ed47-3c33-4553-aa1d-776fe62495ae req-ae27f571-f62a-45a4-af7a-e9ae9da6f55c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-355865cd-5e10-47f6-9353-89df93d33afb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.395 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.558 187256 DEBUG nova.compute.manager [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Received event network-vif-plugged-da59e250-e8c3-4aa3-a762-8c4ce1941548 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.559 187256 DEBUG oslo_concurrency.lockutils [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "355865cd-5e10-47f6-9353-89df93d33afb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.559 187256 DEBUG oslo_concurrency.lockutils [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.560 187256 DEBUG oslo_concurrency.lockutils [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.560 187256 DEBUG nova.compute.manager [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Processing event network-vif-plugged-da59e250-e8c3-4aa3-a762-8c4ce1941548 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.560 187256 DEBUG nova.compute.manager [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Received event network-vif-plugged-da59e250-e8c3-4aa3-a762-8c4ce1941548 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.560 187256 DEBUG oslo_concurrency.lockutils [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "355865cd-5e10-47f6-9353-89df93d33afb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.560 187256 DEBUG oslo_concurrency.lockutils [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.560 187256 DEBUG oslo_concurrency.lockutils [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.561 187256 DEBUG nova.compute.manager [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] No waiting events found dispatching network-vif-plugged-da59e250-e8c3-4aa3-a762-8c4ce1941548 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.561 187256 WARNING nova.compute.manager [req-583a330f-8f53-428b-b240-56ac6bc04f9a req-3c5e71cd-61aa-4531-88ac-fda9727c15c7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Received unexpected event network-vif-plugged-da59e250-e8c3-4aa3-a762-8c4ce1941548 for instance with vm_state building and task_state spawning.#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.562 187256 DEBUG nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.567 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346760.5667443, 355865cd-5e10-47f6-9353-89df93d33afb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.567 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.569 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.573 187256 INFO nova.virt.libvirt.driver [-] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Instance spawned successfully.#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.574 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.595 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.607 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.610 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.611 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.611 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.612 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.612 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.612 187256 DEBUG nova.virt.libvirt.driver [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.653 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.716 187256 INFO nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Took 18.45 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.716 187256 DEBUG nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.800 187256 INFO nova.compute.manager [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Took 19.54 seconds to build instance.#033[00m
Nov 28 11:19:20 np0005538960 nova_compute[187252]: 2025-11-28 16:19:20.819 187256 DEBUG oslo_concurrency.lockutils [None req-4f058f4c-3c5a-4df1-80b9-6481362c1bb9 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:22 np0005538960 nova_compute[187252]: 2025-11-28 16:19:22.111 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:19:22 np0005538960 podman[214712]: 2025-11-28 16:19:22.224877504 +0000 UTC m=+0.122093705 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:19:22 np0005538960 nova_compute[187252]: 2025-11-28 16:19:22.357 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:23 np0005538960 nova_compute[187252]: 2025-11-28 16:19:23.834 187256 DEBUG nova.network.neutron [req-0ad2e51f-32db-45c0-a4a0-fbd141812c77 req-03d48811-12c3-4744-a501-0c787a0a7727 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updated VIF entry in instance network info cache for port 6b498512-32dd-4e59-95bd-71c3a69bb44f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:19:23 np0005538960 nova_compute[187252]: 2025-11-28 16:19:23.835 187256 DEBUG nova.network.neutron [req-0ad2e51f-32db-45c0-a4a0-fbd141812c77 req-03d48811-12c3-4744-a501-0c787a0a7727 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updating instance_info_cache with network_info: [{"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:19:23 np0005538960 nova_compute[187252]: 2025-11-28 16:19:23.863 187256 DEBUG oslo_concurrency.lockutils [req-0ad2e51f-32db-45c0-a4a0-fbd141812c77 req-03d48811-12c3-4744-a501-0c787a0a7727 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:19:25 np0005538960 nova_compute[187252]: 2025-11-28 16:19:25.399 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:26 np0005538960 podman[214740]: 2025-11-28 16:19:26.161568895 +0000 UTC m=+0.065622139 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 11:19:26 np0005538960 podman[214739]: 2025-11-28 16:19:26.191806547 +0000 UTC m=+0.096717451 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 11:19:27 np0005538960 nova_compute[187252]: 2025-11-28 16:19:27.359 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:29 np0005538960 nova_compute[187252]: 2025-11-28 16:19:29.570 187256 DEBUG nova.compute.manager [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.022 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.023 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.048 187256 DEBUG nova.objects.instance [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'pci_requests' on Instance uuid 8b80b6d9-b521-40f5-be13-ef8f7196f818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.069 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.070 187256 INFO nova.compute.claims [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.071 187256 DEBUG nova.objects.instance [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'resources' on Instance uuid 8b80b6d9-b521-40f5-be13-ef8f7196f818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.081 187256 DEBUG nova.objects.instance [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'numa_topology' on Instance uuid 8b80b6d9-b521-40f5-be13-ef8f7196f818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.090 187256 DEBUG nova.objects.instance [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 8b80b6d9-b521-40f5-be13-ef8f7196f818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.145 187256 INFO nova.compute.resource_tracker [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Updating resource usage from migration 748a2fbe-e01c-4cfa-940f-8a5d216d419c#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.146 187256 DEBUG nova.compute.resource_tracker [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Starting to track incoming migration 748a2fbe-e01c-4cfa-940f-8a5d216d419c with flavor c90217bd-1e89-4c68-8e01-33bf1cee456c _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.237 187256 DEBUG nova.compute.provider_tree [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.251 187256 DEBUG nova.scheduler.client.report [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.403 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.637 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.639 187256 INFO nova.compute.manager [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Migrating#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.639 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.639 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.645 187256 INFO nova.compute.rpcapi [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Nov 28 11:19:30 np0005538960 nova_compute[187252]: 2025-11-28 16:19:30.646 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:19:32 np0005538960 podman[214775]: 2025-11-28 16:19:32.165250833 +0000 UTC m=+0.068096619 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:19:32 np0005538960 nova_compute[187252]: 2025-11-28 16:19:32.362 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:35 np0005538960 nova_compute[187252]: 2025-11-28 16:19:35.406 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:35 np0005538960 systemd[1]: Created slice User Slice of UID 42436.
Nov 28 11:19:35 np0005538960 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 11:19:35 np0005538960 systemd-logind[788]: New session 28 of user nova.
Nov 28 11:19:35 np0005538960 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 11:19:35 np0005538960 systemd[1]: Starting User Manager for UID 42436...
Nov 28 11:19:35 np0005538960 podman[214820]: 2025-11-28 16:19:35.505802239 +0000 UTC m=+0.080279624 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350)
Nov 28 11:19:35 np0005538960 systemd[214839]: Queued start job for default target Main User Target.
Nov 28 11:19:35 np0005538960 systemd[214839]: Created slice User Application Slice.
Nov 28 11:19:35 np0005538960 systemd[214839]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 11:19:35 np0005538960 systemd[214839]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 11:19:35 np0005538960 systemd[214839]: Reached target Paths.
Nov 28 11:19:35 np0005538960 systemd[214839]: Reached target Timers.
Nov 28 11:19:35 np0005538960 systemd[214839]: Starting D-Bus User Message Bus Socket...
Nov 28 11:19:35 np0005538960 systemd[214839]: Starting Create User's Volatile Files and Directories...
Nov 28 11:19:35 np0005538960 systemd[214839]: Listening on D-Bus User Message Bus Socket.
Nov 28 11:19:35 np0005538960 systemd[214839]: Reached target Sockets.
Nov 28 11:19:35 np0005538960 systemd[214839]: Finished Create User's Volatile Files and Directories.
Nov 28 11:19:35 np0005538960 systemd[214839]: Reached target Basic System.
Nov 28 11:19:35 np0005538960 systemd[1]: Started User Manager for UID 42436.
Nov 28 11:19:35 np0005538960 systemd[214839]: Reached target Main User Target.
Nov 28 11:19:35 np0005538960 systemd[214839]: Startup finished in 178ms.
Nov 28 11:19:35 np0005538960 systemd[1]: Started Session 28 of User nova.
Nov 28 11:19:35 np0005538960 systemd[1]: session-28.scope: Deactivated successfully.
Nov 28 11:19:35 np0005538960 systemd-logind[788]: Session 28 logged out. Waiting for processes to exit.
Nov 28 11:19:35 np0005538960 systemd-logind[788]: Removed session 28.
Nov 28 11:19:35 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:35Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a8:74:58 10.100.0.8
Nov 28 11:19:35 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:35Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a8:74:58 10.100.0.8
Nov 28 11:19:35 np0005538960 systemd-logind[788]: New session 30 of user nova.
Nov 28 11:19:35 np0005538960 systemd[1]: Started Session 30 of User nova.
Nov 28 11:19:36 np0005538960 systemd[1]: session-30.scope: Deactivated successfully.
Nov 28 11:19:36 np0005538960 systemd-logind[788]: Session 30 logged out. Waiting for processes to exit.
Nov 28 11:19:36 np0005538960 systemd-logind[788]: Removed session 30.
Nov 28 11:19:37 np0005538960 nova_compute[187252]: 2025-11-28 16:19:37.364 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:39 np0005538960 systemd-logind[788]: New session 31 of user nova.
Nov 28 11:19:39 np0005538960 systemd[1]: Started Session 31 of User nova.
Nov 28 11:19:40 np0005538960 nova_compute[187252]: 2025-11-28 16:19:40.410 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:40 np0005538960 systemd[1]: session-31.scope: Deactivated successfully.
Nov 28 11:19:40 np0005538960 systemd-logind[788]: Session 31 logged out. Waiting for processes to exit.
Nov 28 11:19:40 np0005538960 systemd-logind[788]: Removed session 31.
Nov 28 11:19:40 np0005538960 systemd-logind[788]: New session 32 of user nova.
Nov 28 11:19:40 np0005538960 systemd[1]: Started Session 32 of User nova.
Nov 28 11:19:40 np0005538960 systemd[1]: session-32.scope: Deactivated successfully.
Nov 28 11:19:40 np0005538960 systemd-logind[788]: Session 32 logged out. Waiting for processes to exit.
Nov 28 11:19:40 np0005538960 systemd-logind[788]: Removed session 32.
Nov 28 11:19:40 np0005538960 systemd-logind[788]: New session 33 of user nova.
Nov 28 11:19:40 np0005538960 systemd[1]: Started Session 33 of User nova.
Nov 28 11:19:41 np0005538960 systemd[1]: session-33.scope: Deactivated successfully.
Nov 28 11:19:41 np0005538960 systemd-logind[788]: Session 33 logged out. Waiting for processes to exit.
Nov 28 11:19:41 np0005538960 systemd-logind[788]: Removed session 33.
Nov 28 11:19:42 np0005538960 nova_compute[187252]: 2025-11-28 16:19:42.367 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:43 np0005538960 nova_compute[187252]: 2025-11-28 16:19:43.337 187256 DEBUG nova.compute.manager [req-11cbf669-70b0-4781-9a27-8ce34651351b req-1827c2ea-56cd-4ff3-acc5-de9c68a35cac 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received event network-vif-unplugged-9d1a652f-be5e-4b1b-b759-f17b3181308b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:19:43 np0005538960 nova_compute[187252]: 2025-11-28 16:19:43.339 187256 DEBUG oslo_concurrency.lockutils [req-11cbf669-70b0-4781-9a27-8ce34651351b req-1827c2ea-56cd-4ff3-acc5-de9c68a35cac 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:43 np0005538960 nova_compute[187252]: 2025-11-28 16:19:43.339 187256 DEBUG oslo_concurrency.lockutils [req-11cbf669-70b0-4781-9a27-8ce34651351b req-1827c2ea-56cd-4ff3-acc5-de9c68a35cac 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:43 np0005538960 nova_compute[187252]: 2025-11-28 16:19:43.339 187256 DEBUG oslo_concurrency.lockutils [req-11cbf669-70b0-4781-9a27-8ce34651351b req-1827c2ea-56cd-4ff3-acc5-de9c68a35cac 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:43 np0005538960 nova_compute[187252]: 2025-11-28 16:19:43.339 187256 DEBUG nova.compute.manager [req-11cbf669-70b0-4781-9a27-8ce34651351b req-1827c2ea-56cd-4ff3-acc5-de9c68a35cac 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] No waiting events found dispatching network-vif-unplugged-9d1a652f-be5e-4b1b-b759-f17b3181308b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:19:43 np0005538960 nova_compute[187252]: 2025-11-28 16:19:43.340 187256 WARNING nova.compute.manager [req-11cbf669-70b0-4781-9a27-8ce34651351b req-1827c2ea-56cd-4ff3-acc5-de9c68a35cac 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received unexpected event network-vif-unplugged-9d1a652f-be5e-4b1b-b759-f17b3181308b for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 28 11:19:43 np0005538960 nova_compute[187252]: 2025-11-28 16:19:43.554 187256 INFO nova.network.neutron [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Updating port 9d1a652f-be5e-4b1b-b759-f17b3181308b with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 28 11:19:44 np0005538960 podman[214882]: 2025-11-28 16:19:44.227637051 +0000 UTC m=+0.113947198 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=edpm, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:19:45 np0005538960 nova_compute[187252]: 2025-11-28 16:19:45.413 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:47 np0005538960 podman[214904]: 2025-11-28 16:19:47.154872315 +0000 UTC m=+0.060672400 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:19:47 np0005538960 nova_compute[187252]: 2025-11-28 16:19:47.369 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:47 np0005538960 nova_compute[187252]: 2025-11-28 16:19:47.517 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:47 np0005538960 nova_compute[187252]: 2025-11-28 16:19:47.727 187256 DEBUG nova.compute.manager [req-e8cc3107-4085-4661-b602-6d8ea7ce2c74 req-a3735523-fa73-4436-b41c-1fbb0dc19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received event network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:19:47 np0005538960 nova_compute[187252]: 2025-11-28 16:19:47.728 187256 DEBUG oslo_concurrency.lockutils [req-e8cc3107-4085-4661-b602-6d8ea7ce2c74 req-a3735523-fa73-4436-b41c-1fbb0dc19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:47 np0005538960 nova_compute[187252]: 2025-11-28 16:19:47.728 187256 DEBUG oslo_concurrency.lockutils [req-e8cc3107-4085-4661-b602-6d8ea7ce2c74 req-a3735523-fa73-4436-b41c-1fbb0dc19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:47 np0005538960 nova_compute[187252]: 2025-11-28 16:19:47.729 187256 DEBUG oslo_concurrency.lockutils [req-e8cc3107-4085-4661-b602-6d8ea7ce2c74 req-a3735523-fa73-4436-b41c-1fbb0dc19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:47 np0005538960 nova_compute[187252]: 2025-11-28 16:19:47.729 187256 DEBUG nova.compute.manager [req-e8cc3107-4085-4661-b602-6d8ea7ce2c74 req-a3735523-fa73-4436-b41c-1fbb0dc19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] No waiting events found dispatching network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:19:47 np0005538960 nova_compute[187252]: 2025-11-28 16:19:47.730 187256 WARNING nova.compute.manager [req-e8cc3107-4085-4661-b602-6d8ea7ce2c74 req-a3735523-fa73-4436-b41c-1fbb0dc19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received unexpected event network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.518 187256 DEBUG oslo_concurrency.lockutils [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquiring lock "355865cd-5e10-47f6-9353-89df93d33afb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.519 187256 DEBUG oslo_concurrency.lockutils [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.519 187256 DEBUG oslo_concurrency.lockutils [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquiring lock "355865cd-5e10-47f6-9353-89df93d33afb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.519 187256 DEBUG oslo_concurrency.lockutils [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.519 187256 DEBUG oslo_concurrency.lockutils [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.520 187256 INFO nova.compute.manager [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Terminating instance#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.521 187256 DEBUG nova.compute.manager [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:19:49 np0005538960 kernel: tapda59e250-e8 (unregistering): left promiscuous mode
Nov 28 11:19:49 np0005538960 NetworkManager[55548]: <info>  [1764346789.5580] device (tapda59e250-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.572 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:49 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:49Z|00040|binding|INFO|Releasing lport da59e250-e8c3-4aa3-a762-8c4ce1941548 from this chassis (sb_readonly=0)
Nov 28 11:19:49 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:49Z|00041|binding|INFO|Setting lport da59e250-e8c3-4aa3-a762-8c4ce1941548 down in Southbound
Nov 28 11:19:49 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:49Z|00042|binding|INFO|Removing iface tapda59e250-e8 ovn-installed in OVS
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.575 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:49 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:49.582 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:74:58 10.100.0.8'], port_security=['fa:16:3e:a8:74:58 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '355865cd-5e10-47f6-9353-89df93d33afb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c85ae182-6cfd-451f-a7ec-421bd1586236', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '752d0a937cab4d35aabf297fb7442543', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0fff1aaf-06b5-462b-b0fb-38684dbb1340', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c9ed2682-909b-49cd-b50f-f638e52c5e89, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=da59e250-e8c3-4aa3-a762-8c4ce1941548) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:19:49 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:49.583 104369 INFO neutron.agent.ovn.metadata.agent [-] Port da59e250-e8c3-4aa3-a762-8c4ce1941548 in datapath c85ae182-6cfd-451f-a7ec-421bd1586236 unbound from our chassis#033[00m
Nov 28 11:19:49 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:49.585 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c85ae182-6cfd-451f-a7ec-421bd1586236, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:19:49 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:49.586 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e597ef65-1d15-4663-92bb-ea80799f29b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:49 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:49.586 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236 namespace which is not needed anymore#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.590 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.623 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquiring lock "refresh_cache-8b80b6d9-b521-40f5-be13-ef8f7196f818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.624 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquired lock "refresh_cache-8b80b6d9-b521-40f5-be13-ef8f7196f818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.624 187256 DEBUG nova.network.neutron [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:19:49 np0005538960 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Deactivated successfully.
Nov 28 11:19:49 np0005538960 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000006.scope: Consumed 15.717s CPU time.
Nov 28 11:19:49 np0005538960 systemd-machined[153518]: Machine qemu-2-instance-00000006 terminated.
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.784 187256 INFO nova.virt.libvirt.driver [-] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Instance destroyed successfully.#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.784 187256 DEBUG nova.objects.instance [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lazy-loading 'resources' on Instance uuid 355865cd-5e10-47f6-9353-89df93d33afb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.796 187256 DEBUG nova.virt.libvirt.vif [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:18:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1425157105',display_name='tempest-TestServerMultinode-server-1425157105',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testservermultinode-server-1425157105',id=6,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:19:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='752d0a937cab4d35aabf297fb7442543',ramdisk_id='',reservation_id='r-qms8tkvf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-718990049',owner_user_name='tempest-TestServerMultinode-718990049-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:19:20Z,user_data=None,user_id='9552307b9b4a4494a17ee2eddce9f8ac',uuid=355865cd-5e10-47f6-9353-89df93d33afb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "address": "fa:16:3e:a8:74:58", "network": {"id": "c85ae182-6cfd-451f-a7ec-421bd1586236", "bridge": "br-int", "label": "tempest-TestServerMultinode-781725514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff8db40a0354f34ad116d29c11a0b49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda59e250-e8", "ovs_interfaceid": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.796 187256 DEBUG nova.network.os_vif_util [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Converting VIF {"id": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "address": "fa:16:3e:a8:74:58", "network": {"id": "c85ae182-6cfd-451f-a7ec-421bd1586236", "bridge": "br-int", "label": "tempest-TestServerMultinode-781725514-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "cff8db40a0354f34ad116d29c11a0b49", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapda59e250-e8", "ovs_interfaceid": "da59e250-e8c3-4aa3-a762-8c4ce1941548", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.797 187256 DEBUG nova.network.os_vif_util [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a8:74:58,bridge_name='br-int',has_traffic_filtering=True,id=da59e250-e8c3-4aa3-a762-8c4ce1941548,network=Network(c85ae182-6cfd-451f-a7ec-421bd1586236),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda59e250-e8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.797 187256 DEBUG os_vif [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a8:74:58,bridge_name='br-int',has_traffic_filtering=True,id=da59e250-e8c3-4aa3-a762-8c4ce1941548,network=Network(c85ae182-6cfd-451f-a7ec-421bd1586236),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda59e250-e8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.799 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.799 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda59e250-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.800 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.802 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.804 187256 INFO os_vif [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a8:74:58,bridge_name='br-int',has_traffic_filtering=True,id=da59e250-e8c3-4aa3-a762-8c4ce1941548,network=Network(c85ae182-6cfd-451f-a7ec-421bd1586236),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapda59e250-e8')#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.805 187256 INFO nova.virt.libvirt.driver [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Deleting instance files /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb_del#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.806 187256 INFO nova.virt.libvirt.driver [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Deletion of /var/lib/nova/instances/355865cd-5e10-47f6-9353-89df93d33afb_del complete#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.862 187256 DEBUG nova.virt.libvirt.host [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.862 187256 INFO nova.virt.libvirt.host [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] UEFI support detected#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.865 187256 INFO nova.compute.manager [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.866 187256 DEBUG oslo.service.loopingcall [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.866 187256 DEBUG nova.compute.manager [-] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:19:49 np0005538960 nova_compute[187252]: 2025-11-28 16:19:49.867 187256 DEBUG nova.network.neutron [-] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:19:50 np0005538960 neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236[214694]: [NOTICE]   (214698) : haproxy version is 2.8.14-c23fe91
Nov 28 11:19:50 np0005538960 neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236[214694]: [NOTICE]   (214698) : path to executable is /usr/sbin/haproxy
Nov 28 11:19:50 np0005538960 neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236[214694]: [WARNING]  (214698) : Exiting Master process...
Nov 28 11:19:50 np0005538960 neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236[214694]: [ALERT]    (214698) : Current worker (214700) exited with code 143 (Terminated)
Nov 28 11:19:50 np0005538960 neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236[214694]: [WARNING]  (214698) : All workers exited. Exiting... (0)
Nov 28 11:19:50 np0005538960 systemd[1]: libpod-d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b.scope: Deactivated successfully.
Nov 28 11:19:50 np0005538960 podman[214951]: 2025-11-28 16:19:50.07226408 +0000 UTC m=+0.347686754 container died d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:19:50 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b-userdata-shm.mount: Deactivated successfully.
Nov 28 11:19:50 np0005538960 systemd[1]: var-lib-containers-storage-overlay-1169710d0081143b0887cb9d8614102324a37a04cfcf45a77d2283f2ad438a8b-merged.mount: Deactivated successfully.
Nov 28 11:19:50 np0005538960 podman[214951]: 2025-11-28 16:19:50.211165282 +0000 UTC m=+0.486587956 container cleanup d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 11:19:50 np0005538960 systemd[1]: libpod-conmon-d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b.scope: Deactivated successfully.
Nov 28 11:19:50 np0005538960 podman[214997]: 2025-11-28 16:19:50.288691127 +0000 UTC m=+0.053678249 container remove d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 11:19:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:50.295 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[29c65680-9b3a-4d82-8553-57782ea9c1cc]: (4, ('Fri Nov 28 04:19:49 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236 (d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b)\nd5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b\nFri Nov 28 04:19:50 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236 (d5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b)\nd5aaa8394e4a185b5cf1fe3175c4c4672964d123ee3cfbfd789c390ddd75190b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:50.298 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ed2edd7c-5536-4be3-9b44-04cd15895157]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:50.299 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc85ae182-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:50 np0005538960 kernel: tapc85ae182-60: left promiscuous mode
Nov 28 11:19:50 np0005538960 nova_compute[187252]: 2025-11-28 16:19:50.302 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:50 np0005538960 nova_compute[187252]: 2025-11-28 16:19:50.314 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:50.318 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[201c26b5-3724-4633-a059-d4be1bfeb5ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:50.332 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[50f61c81-8caf-4fa9-b6e8-d4884c5ab9e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:50.334 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b69ae3-fe42-47cc-b7da-e20ebd4957e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:50.355 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[56b29bc1-17eb-401b-80ea-3a74ea7fec3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 385589, 'reachable_time': 25822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215011, 'error': None, 'target': 'ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:50 np0005538960 systemd[1]: run-netns-ovnmeta\x2dc85ae182\x2d6cfd\x2d451f\x2da7ec\x2d421bd1586236.mount: Deactivated successfully.
Nov 28 11:19:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:50.366 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c85ae182-6cfd-451f-a7ec-421bd1586236 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:19:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:50.368 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4797fb-decd-4f3c-8540-1b56c8d2d21a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:50 np0005538960 nova_compute[187252]: 2025-11-28 16:19:50.640 187256 DEBUG nova.compute.manager [req-e2beca32-95c3-40bd-a548-a0475ce30894 req-efeb21eb-4a19-4fed-9c5a-24dd978966df 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received event network-changed-9d1a652f-be5e-4b1b-b759-f17b3181308b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:19:50 np0005538960 nova_compute[187252]: 2025-11-28 16:19:50.641 187256 DEBUG nova.compute.manager [req-e2beca32-95c3-40bd-a548-a0475ce30894 req-efeb21eb-4a19-4fed-9c5a-24dd978966df 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Refreshing instance network info cache due to event network-changed-9d1a652f-be5e-4b1b-b759-f17b3181308b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:19:50 np0005538960 nova_compute[187252]: 2025-11-28 16:19:50.641 187256 DEBUG oslo_concurrency.lockutils [req-e2beca32-95c3-40bd-a548-a0475ce30894 req-efeb21eb-4a19-4fed-9c5a-24dd978966df 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-8b80b6d9-b521-40f5-be13-ef8f7196f818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:19:51 np0005538960 nova_compute[187252]: 2025-11-28 16:19:51.287 187256 DEBUG nova.network.neutron [-] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:19:51 np0005538960 nova_compute[187252]: 2025-11-28 16:19:51.323 187256 INFO nova.compute.manager [-] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Took 1.46 seconds to deallocate network for instance.#033[00m
Nov 28 11:19:51 np0005538960 nova_compute[187252]: 2025-11-28 16:19:51.374 187256 DEBUG oslo_concurrency.lockutils [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:51 np0005538960 nova_compute[187252]: 2025-11-28 16:19:51.374 187256 DEBUG oslo_concurrency.lockutils [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:51 np0005538960 nova_compute[187252]: 2025-11-28 16:19:51.503 187256 DEBUG nova.compute.provider_tree [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:19:51 np0005538960 nova_compute[187252]: 2025-11-28 16:19:51.537 187256 DEBUG nova.scheduler.client.report [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:19:51 np0005538960 systemd[1]: Stopping User Manager for UID 42436...
Nov 28 11:19:51 np0005538960 systemd[214839]: Activating special unit Exit the Session...
Nov 28 11:19:51 np0005538960 systemd[214839]: Stopped target Main User Target.
Nov 28 11:19:51 np0005538960 systemd[214839]: Stopped target Basic System.
Nov 28 11:19:51 np0005538960 systemd[214839]: Stopped target Paths.
Nov 28 11:19:51 np0005538960 systemd[214839]: Stopped target Sockets.
Nov 28 11:19:51 np0005538960 systemd[214839]: Stopped target Timers.
Nov 28 11:19:51 np0005538960 systemd[214839]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 11:19:51 np0005538960 systemd[214839]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 11:19:51 np0005538960 systemd[214839]: Closed D-Bus User Message Bus Socket.
Nov 28 11:19:51 np0005538960 systemd[214839]: Stopped Create User's Volatile Files and Directories.
Nov 28 11:19:51 np0005538960 systemd[214839]: Removed slice User Application Slice.
Nov 28 11:19:51 np0005538960 systemd[214839]: Reached target Shutdown.
Nov 28 11:19:51 np0005538960 systemd[214839]: Finished Exit the Session.
Nov 28 11:19:51 np0005538960 systemd[214839]: Reached target Exit the Session.
Nov 28 11:19:51 np0005538960 systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 11:19:51 np0005538960 systemd[1]: Stopped User Manager for UID 42436.
Nov 28 11:19:51 np0005538960 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 11:19:51 np0005538960 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 11:19:51 np0005538960 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 11:19:51 np0005538960 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 11:19:51 np0005538960 systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 11:19:51 np0005538960 nova_compute[187252]: 2025-11-28 16:19:51.661 187256 DEBUG oslo_concurrency.lockutils [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.287s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:51 np0005538960 nova_compute[187252]: 2025-11-28 16:19:51.817 187256 INFO nova.scheduler.client.report [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Deleted allocations for instance 355865cd-5e10-47f6-9353-89df93d33afb#033[00m
Nov 28 11:19:51 np0005538960 nova_compute[187252]: 2025-11-28 16:19:51.956 187256 DEBUG oslo_concurrency.lockutils [None req-d646371c-b977-45c3-8eec-bc680bd2c368 9552307b9b4a4494a17ee2eddce9f8ac 752d0a937cab4d35aabf297fb7442543 - - default default] Lock "355865cd-5e10-47f6-9353-89df93d33afb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.437s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.371 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.663 187256 DEBUG nova.network.neutron [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Updating instance_info_cache with network_info: [{"id": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "address": "fa:16:3e:f3:98:56", "network": {"id": "3f0dcdca-a602-45f0-90ca-94c068cfb9fe", "bridge": "br-int", "label": "tempest-network-smoke--1328571052", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d1a652f-be", "ovs_interfaceid": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.689 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Releasing lock "refresh_cache-8b80b6d9-b521-40f5-be13-ef8f7196f818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.693 187256 DEBUG oslo_concurrency.lockutils [req-e2beca32-95c3-40bd-a548-a0475ce30894 req-efeb21eb-4a19-4fed-9c5a-24dd978966df 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-8b80b6d9-b521-40f5-be13-ef8f7196f818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.693 187256 DEBUG nova.network.neutron [req-e2beca32-95c3-40bd-a548-a0475ce30894 req-efeb21eb-4a19-4fed-9c5a-24dd978966df 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Refreshing network info cache for port 9d1a652f-be5e-4b1b-b759-f17b3181308b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.725 187256 DEBUG nova.compute.manager [req-41dc14cf-f48f-4d64-a790-f47f33ccab01 req-041a0df1-124d-4afc-bee2-03d9e4ab2a53 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Received event network-vif-deleted-da59e250-e8c3-4aa3-a762-8c4ce1941548 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.779 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.781 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.781 187256 INFO nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Creating image(s)#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.782 187256 DEBUG nova.objects.instance [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8b80b6d9-b521-40f5-be13-ef8f7196f818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.803 187256 DEBUG oslo_concurrency.processutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.868 187256 DEBUG oslo_concurrency.processutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.870 187256 DEBUG nova.virt.disk.api [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Checking if we can resize image /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.871 187256 DEBUG oslo_concurrency.processutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.943 187256 DEBUG oslo_concurrency.processutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.946 187256 DEBUG nova.virt.disk.api [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Cannot resize image /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.965 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.966 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Ensure instance console log exists: /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.967 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.967 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.967 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.970 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Start _get_guest_xml network_info=[{"id": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "address": "fa:16:3e:f3:98:56", "network": {"id": "3f0dcdca-a602-45f0-90ca-94c068cfb9fe", "bridge": "br-int", "label": "tempest-network-smoke--1328571052", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1328571052", "vif_mac": "fa:16:3e:f3:98:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d1a652f-be", "ovs_interfaceid": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.975 187256 WARNING nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.979 187256 DEBUG nova.virt.libvirt.host [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.979 187256 DEBUG nova.virt.libvirt.host [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.986 187256 DEBUG nova.virt.libvirt.host [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.987 187256 DEBUG nova.virt.libvirt.host [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.989 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.990 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.990 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.990 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.991 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.991 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.991 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.991 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.992 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.992 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.992 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.993 187256 DEBUG nova.virt.hardware [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:19:52 np0005538960 nova_compute[187252]: 2025-11-28 16:19:52.993 187256 DEBUG nova.objects.instance [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'vcpu_model' on Instance uuid 8b80b6d9-b521-40f5-be13-ef8f7196f818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.020 187256 DEBUG oslo_concurrency.processutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.083 187256 DEBUG oslo_concurrency.processutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk.config --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.084 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquiring lock "/var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.084 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "/var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.085 187256 DEBUG oslo_concurrency.lockutils [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "/var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.087 187256 DEBUG nova.virt.libvirt.vif [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-293430170',display_name='tempest-TestNetworkAdvancedServerOps-server-293430170',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-293430170',id=2,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIk3YO+dFZG5+YRQwZvUYKMKey3Y4NVkpxhGbNpTVWLTMN+NNatZCIv1+AM8vk/TswcxbpEoWIRMm0TdvsUuk1vjwuvKVAK++OHLZLDy73NJk7EOSc0UGWpowmldPoBhxA==',key_name='tempest-TestNetworkAdvancedServerOps-567897740',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:18:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-bf562qq9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:19:42Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=8b80b6d9-b521-40f5-be13-ef8f7196f818,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "address": "fa:16:3e:f3:98:56", "network": {"id": "3f0dcdca-a602-45f0-90ca-94c068cfb9fe", "bridge": "br-int", "label": "tempest-network-smoke--1328571052", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1328571052", "vif_mac": "fa:16:3e:f3:98:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d1a652f-be", "ovs_interfaceid": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.087 187256 DEBUG nova.network.os_vif_util [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Converting VIF {"id": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "address": "fa:16:3e:f3:98:56", "network": {"id": "3f0dcdca-a602-45f0-90ca-94c068cfb9fe", "bridge": "br-int", "label": "tempest-network-smoke--1328571052", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1328571052", "vif_mac": "fa:16:3e:f3:98:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d1a652f-be", "ovs_interfaceid": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.088 187256 DEBUG nova.network.os_vif_util [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:98:56,bridge_name='br-int',has_traffic_filtering=True,id=9d1a652f-be5e-4b1b-b759-f17b3181308b,network=Network(3f0dcdca-a602-45f0-90ca-94c068cfb9fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d1a652f-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.091 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <uuid>8b80b6d9-b521-40f5-be13-ef8f7196f818</uuid>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <name>instance-00000002</name>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-293430170</nova:name>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:19:52</nova:creationTime>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:        <nova:user uuid="5d381eba17324dd5ad798648b82d0115">tempest-TestNetworkAdvancedServerOps-762685809-project-member</nova:user>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:        <nova:project uuid="7e408bace48b41a1ac0677d300b6d288">tempest-TestNetworkAdvancedServerOps-762685809</nova:project>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:        <nova:port uuid="9d1a652f-be5e-4b1b-b759-f17b3181308b">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <entry name="serial">8b80b6d9-b521-40f5-be13-ef8f7196f818</entry>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <entry name="uuid">8b80b6d9-b521-40f5-be13-ef8f7196f818</entry>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk.config"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:f3:98:56"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <target dev="tap9d1a652f-be"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/console.log" append="off"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:19:53 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:19:53 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:19:53 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:19:53 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.092 187256 DEBUG nova.virt.libvirt.vif [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-293430170',display_name='tempest-TestNetworkAdvancedServerOps-server-293430170',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-293430170',id=2,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIk3YO+dFZG5+YRQwZvUYKMKey3Y4NVkpxhGbNpTVWLTMN+NNatZCIv1+AM8vk/TswcxbpEoWIRMm0TdvsUuk1vjwuvKVAK++OHLZLDy73NJk7EOSc0UGWpowmldPoBhxA==',key_name='tempest-TestNetworkAdvancedServerOps-567897740',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:18:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-bf562qq9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:19:42Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=8b80b6d9-b521-40f5-be13-ef8f7196f818,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "address": "fa:16:3e:f3:98:56", "network": {"id": "3f0dcdca-a602-45f0-90ca-94c068cfb9fe", "bridge": "br-int", "label": "tempest-network-smoke--1328571052", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1328571052", "vif_mac": "fa:16:3e:f3:98:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d1a652f-be", "ovs_interfaceid": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.092 187256 DEBUG nova.network.os_vif_util [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Converting VIF {"id": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "address": "fa:16:3e:f3:98:56", "network": {"id": "3f0dcdca-a602-45f0-90ca-94c068cfb9fe", "bridge": "br-int", "label": "tempest-network-smoke--1328571052", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1328571052", "vif_mac": "fa:16:3e:f3:98:56"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d1a652f-be", "ovs_interfaceid": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.093 187256 DEBUG nova.network.os_vif_util [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:98:56,bridge_name='br-int',has_traffic_filtering=True,id=9d1a652f-be5e-4b1b-b759-f17b3181308b,network=Network(3f0dcdca-a602-45f0-90ca-94c068cfb9fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d1a652f-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.093 187256 DEBUG os_vif [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:98:56,bridge_name='br-int',has_traffic_filtering=True,id=9d1a652f-be5e-4b1b-b759-f17b3181308b,network=Network(3f0dcdca-a602-45f0-90ca-94c068cfb9fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d1a652f-be') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.094 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.094 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.095 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.097 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.098 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d1a652f-be, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.098 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9d1a652f-be, col_values=(('external_ids', {'iface-id': '9d1a652f-be5e-4b1b-b759-f17b3181308b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f3:98:56', 'vm-uuid': '8b80b6d9-b521-40f5-be13-ef8f7196f818'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.100 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:53 np0005538960 NetworkManager[55548]: <info>  [1764346793.1008] manager: (tap9d1a652f-be): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.102 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.106 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.106 187256 INFO os_vif [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:98:56,bridge_name='br-int',has_traffic_filtering=True,id=9d1a652f-be5e-4b1b-b759-f17b3181308b,network=Network(3f0dcdca-a602-45f0-90ca-94c068cfb9fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d1a652f-be')#033[00m
Nov 28 11:19:53 np0005538960 podman[215022]: 2025-11-28 16:19:53.207643301 +0000 UTC m=+0.101136238 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.295 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.296 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.296 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] No VIF found with MAC fa:16:3e:f3:98:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.297 187256 INFO nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Using config drive#033[00m
Nov 28 11:19:53 np0005538960 kernel: tap9d1a652f-be: entered promiscuous mode
Nov 28 11:19:53 np0005538960 NetworkManager[55548]: <info>  [1764346793.3491] manager: (tap9d1a652f-be): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Nov 28 11:19:53 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:53Z|00043|binding|INFO|Claiming lport 9d1a652f-be5e-4b1b-b759-f17b3181308b for this chassis.
Nov 28 11:19:53 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:53Z|00044|binding|INFO|9d1a652f-be5e-4b1b-b759-f17b3181308b: Claiming fa:16:3e:f3:98:56 10.100.0.9
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.351 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:53 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:53Z|00045|binding|INFO|Setting lport 9d1a652f-be5e-4b1b-b759-f17b3181308b ovn-installed in OVS
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.366 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.368 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:53 np0005538960 systemd-machined[153518]: New machine qemu-3-instance-00000002.
Nov 28 11:19:53 np0005538960 systemd-udevd[215068]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:19:53 np0005538960 systemd[1]: Started Virtual Machine qemu-3-instance-00000002.
Nov 28 11:19:53 np0005538960 NetworkManager[55548]: <info>  [1764346793.4096] device (tap9d1a652f-be): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:19:53 np0005538960 NetworkManager[55548]: <info>  [1764346793.4117] device (tap9d1a652f-be): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.778 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346793.7780168, 8b80b6d9-b521-40f5-be13-ef8f7196f818 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.779 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.781 187256 DEBUG nova.compute.manager [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.785 187256 INFO nova.virt.libvirt.driver [-] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Instance running successfully.#033[00m
Nov 28 11:19:53 np0005538960 virtqemud[186797]: argument unsupported: QEMU guest agent is not configured
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.787 187256 DEBUG nova.virt.libvirt.guest [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.787 187256 DEBUG nova.virt.libvirt.driver [None req-40d3421a-ccb5-43ef-9010-a3b097faab57 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.804 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.807 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.946 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.946 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346793.7791138, 8b80b6d9-b521-40f5-be13-ef8f7196f818 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.946 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] VM Started (Lifecycle Event)#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.962 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.965 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:19:53 np0005538960 nova_compute[187252]: 2025-11-28 16:19:53.978 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.402 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:98:56 10.100.0.9'], port_security=['fa:16:3e:f3:98:56 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8b80b6d9-b521-40f5-be13-ef8f7196f818', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f0dcdca-a602-45f0-90ca-94c068cfb9fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'd53b20f5-0017-4567-861d-90c63047bf99', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.234'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=537c4bad-f0cb-43d1-9991-03edbd05ac67, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=9d1a652f-be5e-4b1b-b759-f17b3181308b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:19:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:54Z|00046|binding|INFO|Setting lport 9d1a652f-be5e-4b1b-b759-f17b3181308b up in Southbound
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.403 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 9d1a652f-be5e-4b1b-b759-f17b3181308b in datapath 3f0dcdca-a602-45f0-90ca-94c068cfb9fe bound to our chassis#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.405 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3f0dcdca-a602-45f0-90ca-94c068cfb9fe#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.415 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[50a2870b-4dc1-4fc2-943c-02661fccd227]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.416 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3f0dcdca-a1 in ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.417 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3f0dcdca-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.418 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[be3e96e8-303c-452e-aafe-029304e02719]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.419 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5d319f9b-79e6-4d1b-9fb2-b4f39868b44d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.431 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[c896e8f6-0119-4a6d-8cac-bd15cd4a4b81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.446 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[4739189b-2bce-416a-93ec-f29814422912]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.474 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[41abef38-7451-420f-9ec0-2a9843033967]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 systemd-udevd[215070]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.480 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b168d67b-21a2-4894-a729-4926bb21c596]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 NetworkManager[55548]: <info>  [1764346794.4814] manager: (tap3f0dcdca-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.510 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[fa4e52ed-b0b9-42c1-8803-0ede0eea811c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.513 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[cd6178e1-73a4-45f6-832e-62cfd615ad0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 NetworkManager[55548]: <info>  [1764346794.5380] device (tap3f0dcdca-a0): carrier: link connected
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.543 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[4ca90e86-66e9-441c-8538-1ac97c48a9df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.559 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3746d3d4-33be-406a-8cb3-a4503ebaa798]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f0dcdca-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:61:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389411, 'reachable_time': 44435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215109, 'error': None, 'target': 'ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.575 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c0a6b782-e267-4b31-81f4-2023a3cbe991]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe94:61b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 389411, 'tstamp': 389411}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215111, 'error': None, 'target': 'ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.591 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[467dc718-29c4-4dd1-a664-3f7311a5b12c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3f0dcdca-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:94:61:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389411, 'reachable_time': 44435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215112, 'error': None, 'target': 'ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.619 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[700c4aa4-db61-428c-aae3-2eafa8f22e87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.671 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[7b7e26b2-719a-4e71-b06b-1a6c5ba69b06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.673 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f0dcdca-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.673 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.673 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3f0dcdca-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:54 np0005538960 kernel: tap3f0dcdca-a0: entered promiscuous mode
Nov 28 11:19:54 np0005538960 NetworkManager[55548]: <info>  [1764346794.6763] manager: (tap3f0dcdca-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Nov 28 11:19:54 np0005538960 nova_compute[187252]: 2025-11-28 16:19:54.677 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.678 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3f0dcdca-a0, col_values=(('external_ids', {'iface-id': '97cf9afd-a8cb-4167-bf07-d150bd4295dc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:19:54 np0005538960 nova_compute[187252]: 2025-11-28 16:19:54.679 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:19:54Z|00047|binding|INFO|Releasing lport 97cf9afd-a8cb-4167-bf07-d150bd4295dc from this chassis (sb_readonly=0)
Nov 28 11:19:54 np0005538960 nova_compute[187252]: 2025-11-28 16:19:54.691 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.692 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3f0dcdca-a602-45f0-90ca-94c068cfb9fe.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3f0dcdca-a602-45f0-90ca-94c068cfb9fe.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.693 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[22c0e40f-5b32-4c73-8121-a80ead94406c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.694 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-3f0dcdca-a602-45f0-90ca-94c068cfb9fe
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/3f0dcdca-a602-45f0-90ca-94c068cfb9fe.pid.haproxy
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID 3f0dcdca-a602-45f0-90ca-94c068cfb9fe
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:19:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:54.694 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe', 'env', 'PROCESS_TAG=haproxy-3f0dcdca-a602-45f0-90ca-94c068cfb9fe', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3f0dcdca-a602-45f0-90ca-94c068cfb9fe.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:19:55 np0005538960 podman[215145]: 2025-11-28 16:19:55.056325445 +0000 UTC m=+0.049942790 container create 35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:19:55 np0005538960 systemd[1]: Started libpod-conmon-35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc.scope.
Nov 28 11:19:55 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:19:55 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f44888907afc926e8cfa94a877f0a32a1bcff7cc05a47b2e11e951f3e0f1071b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:19:55 np0005538960 podman[215145]: 2025-11-28 16:19:55.030005368 +0000 UTC m=+0.023622733 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:19:55 np0005538960 podman[215145]: 2025-11-28 16:19:55.127342804 +0000 UTC m=+0.120960209 container init 35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 11:19:55 np0005538960 podman[215145]: 2025-11-28 16:19:55.132330774 +0000 UTC m=+0.125948129 container start 35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 11:19:55 np0005538960 neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe[215160]: [NOTICE]   (215164) : New worker (215166) forked
Nov 28 11:19:55 np0005538960 neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe[215160]: [NOTICE]   (215164) : Loading success.
Nov 28 11:19:56 np0005538960 irqbalance[782]: Cannot change IRQ 26 affinity: Operation not permitted
Nov 28 11:19:56 np0005538960 irqbalance[782]: IRQ 26 affinity is now unmanaged
Nov 28 11:19:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:56.862 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:19:56 np0005538960 nova_compute[187252]: 2025-11-28 16:19:56.863 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:19:56.865 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:19:57 np0005538960 podman[215176]: 2025-11-28 16:19:57.16326995 +0000 UTC m=+0.066003688 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 11:19:57 np0005538960 podman[215175]: 2025-11-28 16:19:57.163987046 +0000 UTC m=+0.065746471 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:19:57 np0005538960 nova_compute[187252]: 2025-11-28 16:19:57.208 187256 DEBUG nova.network.neutron [req-e2beca32-95c3-40bd-a548-a0475ce30894 req-efeb21eb-4a19-4fed-9c5a-24dd978966df 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Updated VIF entry in instance network info cache for port 9d1a652f-be5e-4b1b-b759-f17b3181308b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:19:57 np0005538960 nova_compute[187252]: 2025-11-28 16:19:57.209 187256 DEBUG nova.network.neutron [req-e2beca32-95c3-40bd-a548-a0475ce30894 req-efeb21eb-4a19-4fed-9c5a-24dd978966df 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Updating instance_info_cache with network_info: [{"id": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "address": "fa:16:3e:f3:98:56", "network": {"id": "3f0dcdca-a602-45f0-90ca-94c068cfb9fe", "bridge": "br-int", "label": "tempest-network-smoke--1328571052", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d1a652f-be", "ovs_interfaceid": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:19:57 np0005538960 nova_compute[187252]: 2025-11-28 16:19:57.308 187256 DEBUG oslo_concurrency.lockutils [req-e2beca32-95c3-40bd-a548-a0475ce30894 req-efeb21eb-4a19-4fed-9c5a-24dd978966df 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-8b80b6d9-b521-40f5-be13-ef8f7196f818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:19:57 np0005538960 nova_compute[187252]: 2025-11-28 16:19:57.409 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:58 np0005538960 nova_compute[187252]: 2025-11-28 16:19:58.100 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:19:59 np0005538960 nova_compute[187252]: 2025-11-28 16:19:59.569 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:02 np0005538960 nova_compute[187252]: 2025-11-28 16:20:02.281 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:02 np0005538960 nova_compute[187252]: 2025-11-28 16:20:02.412 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:02 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:02.866 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:20:03 np0005538960 nova_compute[187252]: 2025-11-28 16:20:03.102 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:03 np0005538960 podman[215215]: 2025-11-28 16:20:03.158951204 +0000 UTC m=+0.063986560 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:20:04 np0005538960 nova_compute[187252]: 2025-11-28 16:20:04.532 187256 DEBUG nova.compute.manager [req-f8a137fb-6c3d-41ef-8f32-978674a24159 req-fc313a1c-3663-4599-9cf2-825bc52e505c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received event network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:20:04 np0005538960 nova_compute[187252]: 2025-11-28 16:20:04.534 187256 DEBUG oslo_concurrency.lockutils [req-f8a137fb-6c3d-41ef-8f32-978674a24159 req-fc313a1c-3663-4599-9cf2-825bc52e505c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:04 np0005538960 nova_compute[187252]: 2025-11-28 16:20:04.534 187256 DEBUG oslo_concurrency.lockutils [req-f8a137fb-6c3d-41ef-8f32-978674a24159 req-fc313a1c-3663-4599-9cf2-825bc52e505c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:04 np0005538960 nova_compute[187252]: 2025-11-28 16:20:04.534 187256 DEBUG oslo_concurrency.lockutils [req-f8a137fb-6c3d-41ef-8f32-978674a24159 req-fc313a1c-3663-4599-9cf2-825bc52e505c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:04 np0005538960 nova_compute[187252]: 2025-11-28 16:20:04.535 187256 DEBUG nova.compute.manager [req-f8a137fb-6c3d-41ef-8f32-978674a24159 req-fc313a1c-3663-4599-9cf2-825bc52e505c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] No waiting events found dispatching network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:20:04 np0005538960 nova_compute[187252]: 2025-11-28 16:20:04.535 187256 WARNING nova.compute.manager [req-f8a137fb-6c3d-41ef-8f32-978674a24159 req-fc313a1c-3663-4599-9cf2-825bc52e505c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received unexpected event network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b for instance with vm_state resized and task_state None.#033[00m
Nov 28 11:20:04 np0005538960 nova_compute[187252]: 2025-11-28 16:20:04.782 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764346789.781669, 355865cd-5e10-47f6-9353-89df93d33afb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:20:04 np0005538960 nova_compute[187252]: 2025-11-28 16:20:04.783 187256 INFO nova.compute.manager [-] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:20:04 np0005538960 nova_compute[187252]: 2025-11-28 16:20:04.804 187256 DEBUG nova.compute.manager [None req-2b90033d-8b9b-44d7-a47e-fd297f1346b2 - - - - - -] [instance: 355865cd-5e10-47f6-9353-89df93d33afb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:20:06 np0005538960 podman[215243]: 2025-11-28 16:20:06.212344591 +0000 UTC m=+0.106118020 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7)
Nov 28 11:20:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:06Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f3:98:56 10.100.0.9
Nov 28 11:20:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:06.339 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:06.340 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:06.341 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:07 np0005538960 nova_compute[187252]: 2025-11-28 16:20:07.136 187256 DEBUG nova.compute.manager [req-e29aeb72-a967-4749-97a6-31728b162c73 req-43c74ba3-f086-4a0e-9944-d86951709962 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received event network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:20:07 np0005538960 nova_compute[187252]: 2025-11-28 16:20:07.137 187256 DEBUG oslo_concurrency.lockutils [req-e29aeb72-a967-4749-97a6-31728b162c73 req-43c74ba3-f086-4a0e-9944-d86951709962 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:07 np0005538960 nova_compute[187252]: 2025-11-28 16:20:07.137 187256 DEBUG oslo_concurrency.lockutils [req-e29aeb72-a967-4749-97a6-31728b162c73 req-43c74ba3-f086-4a0e-9944-d86951709962 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:07 np0005538960 nova_compute[187252]: 2025-11-28 16:20:07.137 187256 DEBUG oslo_concurrency.lockutils [req-e29aeb72-a967-4749-97a6-31728b162c73 req-43c74ba3-f086-4a0e-9944-d86951709962 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:07 np0005538960 nova_compute[187252]: 2025-11-28 16:20:07.138 187256 DEBUG nova.compute.manager [req-e29aeb72-a967-4749-97a6-31728b162c73 req-43c74ba3-f086-4a0e-9944-d86951709962 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] No waiting events found dispatching network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:20:07 np0005538960 nova_compute[187252]: 2025-11-28 16:20:07.138 187256 WARNING nova.compute.manager [req-e29aeb72-a967-4749-97a6-31728b162c73 req-43c74ba3-f086-4a0e-9944-d86951709962 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received unexpected event network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b for instance with vm_state active and task_state None.#033[00m
Nov 28 11:20:07 np0005538960 nova_compute[187252]: 2025-11-28 16:20:07.414 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:07 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:07Z|00048|binding|INFO|Releasing lport 97cf9afd-a8cb-4167-bf07-d150bd4295dc from this chassis (sb_readonly=0)
Nov 28 11:20:07 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:07Z|00049|binding|INFO|Releasing lport 8da911b6-4b06-444b-b895-eebe136e2189 from this chassis (sb_readonly=0)
Nov 28 11:20:07 np0005538960 nova_compute[187252]: 2025-11-28 16:20:07.962 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:08 np0005538960 nova_compute[187252]: 2025-11-28 16:20:08.104 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:09 np0005538960 nova_compute[187252]: 2025-11-28 16:20:09.298 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:11 np0005538960 nova_compute[187252]: 2025-11-28 16:20:11.748 187256 INFO nova.compute.manager [None req-a90b088a-6ba6-403c-bf1e-3203b700d89a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Get console output#033[00m
Nov 28 11:20:11 np0005538960 nova_compute[187252]: 2025-11-28 16:20:11.754 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:20:12 np0005538960 nova_compute[187252]: 2025-11-28 16:20:12.416 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:12 np0005538960 nova_compute[187252]: 2025-11-28 16:20:12.633 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.030 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.105 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.352 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.353 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.353 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.354 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.467 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.534 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.535 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.596 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.603 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.665 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.666 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.729 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.927 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.929 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5424MB free_disk=73.28720474243164GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.930 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:13 np0005538960 nova_compute[187252]: 2025-11-28 16:20:13.930 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:14 np0005538960 nova_compute[187252]: 2025-11-28 16:20:14.041 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance eff28834-4c5b-46d0-90a8-4be63b9fff80 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:20:14 np0005538960 nova_compute[187252]: 2025-11-28 16:20:14.041 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance 8b80b6d9-b521-40f5-be13-ef8f7196f818 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:20:14 np0005538960 nova_compute[187252]: 2025-11-28 16:20:14.041 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:20:14 np0005538960 nova_compute[187252]: 2025-11-28 16:20:14.041 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:20:14 np0005538960 nova_compute[187252]: 2025-11-28 16:20:14.124 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:20:14 np0005538960 nova_compute[187252]: 2025-11-28 16:20:14.138 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:20:14 np0005538960 nova_compute[187252]: 2025-11-28 16:20:14.169 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:20:14 np0005538960 nova_compute[187252]: 2025-11-28 16:20:14.170 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.043 187256 DEBUG oslo_concurrency.lockutils [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "8b80b6d9-b521-40f5-be13-ef8f7196f818" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.044 187256 DEBUG oslo_concurrency.lockutils [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.044 187256 DEBUG oslo_concurrency.lockutils [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.044 187256 DEBUG oslo_concurrency.lockutils [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.045 187256 DEBUG oslo_concurrency.lockutils [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.046 187256 INFO nova.compute.manager [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Terminating instance#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.047 187256 DEBUG nova.compute.manager [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:20:15 np0005538960 kernel: tap9d1a652f-be (unregistering): left promiscuous mode
Nov 28 11:20:15 np0005538960 NetworkManager[55548]: <info>  [1764346815.0787] device (tap9d1a652f-be): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:20:15 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:15Z|00050|binding|INFO|Releasing lport 9d1a652f-be5e-4b1b-b759-f17b3181308b from this chassis (sb_readonly=0)
Nov 28 11:20:15 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:15Z|00051|binding|INFO|Setting lport 9d1a652f-be5e-4b1b-b759-f17b3181308b down in Southbound
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.100 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:15 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:15Z|00052|binding|INFO|Removing iface tap9d1a652f-be ovn-installed in OVS
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.102 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.114 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:98:56 10.100.0.9'], port_security=['fa:16:3e:f3:98:56 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8b80b6d9-b521-40f5-be13-ef8f7196f818', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3f0dcdca-a602-45f0-90ca-94c068cfb9fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'd53b20f5-0017-4567-861d-90c63047bf99', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=537c4bad-f0cb-43d1-9991-03edbd05ac67, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=9d1a652f-be5e-4b1b-b759-f17b3181308b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.115 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 9d1a652f-be5e-4b1b-b759-f17b3181308b in datapath 3f0dcdca-a602-45f0-90ca-94c068cfb9fe unbound from our chassis#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.119 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.118 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3f0dcdca-a602-45f0-90ca-94c068cfb9fe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.119 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[929502a1-ca28-430d-a467-ea59db5edb27]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.120 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe namespace which is not needed anymore#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.139 187256 DEBUG nova.compute.manager [req-8a7e2d2a-299f-45fc-95cc-61ded26d52a7 req-4c2e2c6f-b644-44cd-b622-10ae37c6d205 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received event network-changed-9d1a652f-be5e-4b1b-b759-f17b3181308b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.140 187256 DEBUG nova.compute.manager [req-8a7e2d2a-299f-45fc-95cc-61ded26d52a7 req-4c2e2c6f-b644-44cd-b622-10ae37c6d205 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Refreshing instance network info cache due to event network-changed-9d1a652f-be5e-4b1b-b759-f17b3181308b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.141 187256 DEBUG oslo_concurrency.lockutils [req-8a7e2d2a-299f-45fc-95cc-61ded26d52a7 req-4c2e2c6f-b644-44cd-b622-10ae37c6d205 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-8b80b6d9-b521-40f5-be13-ef8f7196f818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.141 187256 DEBUG oslo_concurrency.lockutils [req-8a7e2d2a-299f-45fc-95cc-61ded26d52a7 req-4c2e2c6f-b644-44cd-b622-10ae37c6d205 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-8b80b6d9-b521-40f5-be13-ef8f7196f818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.141 187256 DEBUG nova.network.neutron [req-8a7e2d2a-299f-45fc-95cc-61ded26d52a7 req-4c2e2c6f-b644-44cd-b622-10ae37c6d205 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Refreshing network info cache for port 9d1a652f-be5e-4b1b-b759-f17b3181308b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:20:15 np0005538960 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000002.scope: Deactivated successfully.
Nov 28 11:20:15 np0005538960 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000002.scope: Consumed 13.477s CPU time.
Nov 28 11:20:15 np0005538960 systemd-machined[153518]: Machine qemu-3-instance-00000002 terminated.
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.170 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.171 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:20:15 np0005538960 podman[215289]: 2025-11-28 16:20:15.177545942 +0000 UTC m=+0.069853222 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 11:20:15 np0005538960 kernel: tap9d1a652f-be: entered promiscuous mode
Nov 28 11:20:15 np0005538960 kernel: tap9d1a652f-be (unregistering): left promiscuous mode
Nov 28 11:20:15 np0005538960 NetworkManager[55548]: <info>  [1764346815.2728] manager: (tap9d1a652f-be): new Tun device (/org/freedesktop/NetworkManager/Devices/37)
Nov 28 11:20:15 np0005538960 neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe[215160]: [NOTICE]   (215164) : haproxy version is 2.8.14-c23fe91
Nov 28 11:20:15 np0005538960 neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe[215160]: [NOTICE]   (215164) : path to executable is /usr/sbin/haproxy
Nov 28 11:20:15 np0005538960 neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe[215160]: [WARNING]  (215164) : Exiting Master process...
Nov 28 11:20:15 np0005538960 neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe[215160]: [WARNING]  (215164) : Exiting Master process...
Nov 28 11:20:15 np0005538960 neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe[215160]: [ALERT]    (215164) : Current worker (215166) exited with code 143 (Terminated)
Nov 28 11:20:15 np0005538960 neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe[215160]: [WARNING]  (215164) : All workers exited. Exiting... (0)
Nov 28 11:20:15 np0005538960 systemd[1]: libpod-35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc.scope: Deactivated successfully.
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.279 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:15 np0005538960 podman[215330]: 2025-11-28 16:20:15.285046843 +0000 UTC m=+0.059528442 container died 35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:20:15 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc-userdata-shm.mount: Deactivated successfully.
Nov 28 11:20:15 np0005538960 systemd[1]: var-lib-containers-storage-overlay-f44888907afc926e8cfa94a877f0a32a1bcff7cc05a47b2e11e951f3e0f1071b-merged.mount: Deactivated successfully.
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.326 187256 INFO nova.virt.libvirt.driver [-] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Instance destroyed successfully.#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.326 187256 DEBUG nova.objects.instance [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'resources' on Instance uuid 8b80b6d9-b521-40f5-be13-ef8f7196f818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.340 187256 DEBUG nova.virt.libvirt.vif [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:18:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-293430170',display_name='tempest-TestNetworkAdvancedServerOps-server-293430170',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-293430170',id=2,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIk3YO+dFZG5+YRQwZvUYKMKey3Y4NVkpxhGbNpTVWLTMN+NNatZCIv1+AM8vk/TswcxbpEoWIRMm0TdvsUuk1vjwuvKVAK++OHLZLDy73NJk7EOSc0UGWpowmldPoBhxA==',key_name='tempest-TestNetworkAdvancedServerOps-567897740',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:19:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-bf562qq9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:20:04Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=8b80b6d9-b521-40f5-be13-ef8f7196f818,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "address": "fa:16:3e:f3:98:56", "network": {"id": "3f0dcdca-a602-45f0-90ca-94c068cfb9fe", "bridge": "br-int", "label": "tempest-network-smoke--1328571052", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d1a652f-be", "ovs_interfaceid": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.341 187256 DEBUG nova.network.os_vif_util [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "address": "fa:16:3e:f3:98:56", "network": {"id": "3f0dcdca-a602-45f0-90ca-94c068cfb9fe", "bridge": "br-int", "label": "tempest-network-smoke--1328571052", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d1a652f-be", "ovs_interfaceid": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.342 187256 DEBUG nova.network.os_vif_util [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f3:98:56,bridge_name='br-int',has_traffic_filtering=True,id=9d1a652f-be5e-4b1b-b759-f17b3181308b,network=Network(3f0dcdca-a602-45f0-90ca-94c068cfb9fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d1a652f-be') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.342 187256 DEBUG os_vif [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:98:56,bridge_name='br-int',has_traffic_filtering=True,id=9d1a652f-be5e-4b1b-b759-f17b3181308b,network=Network(3f0dcdca-a602-45f0-90ca-94c068cfb9fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d1a652f-be') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.344 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.345 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d1a652f-be, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.348 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.351 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:20:15 np0005538960 podman[215330]: 2025-11-28 16:20:15.352555117 +0000 UTC m=+0.127036696 container cleanup 35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.354 187256 INFO os_vif [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f3:98:56,bridge_name='br-int',has_traffic_filtering=True,id=9d1a652f-be5e-4b1b-b759-f17b3181308b,network=Network(3f0dcdca-a602-45f0-90ca-94c068cfb9fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d1a652f-be')#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.355 187256 INFO nova.virt.libvirt.driver [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Deleting instance files /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818_del#033[00m
Nov 28 11:20:15 np0005538960 systemd[1]: libpod-conmon-35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc.scope: Deactivated successfully.
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.362 187256 INFO nova.virt.libvirt.driver [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Deletion of /var/lib/nova/instances/8b80b6d9-b521-40f5-be13-ef8f7196f818_del complete#033[00m
Nov 28 11:20:15 np0005538960 podman[215375]: 2025-11-28 16:20:15.427562892 +0000 UTC m=+0.048095735 container remove 35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.433 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a98bf608-40b4-4c42-924d-37e3a1b08619]: (4, ('Fri Nov 28 04:20:15 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe (35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc)\n35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc\nFri Nov 28 04:20:15 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe (35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc)\n35019f0a8a8a182d4bfc0fe4c86708fee8d7329f91eb5e650e537da5986dd3cc\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.436 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e672d3db-99de-4534-9842-f31d3bb20d87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.437 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3f0dcdca-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.441 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:15 np0005538960 kernel: tap3f0dcdca-a0: left promiscuous mode
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.447 187256 INFO nova.compute.manager [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.447 187256 DEBUG oslo.service.loopingcall [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.448 187256 DEBUG nova.compute.manager [-] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.448 187256 DEBUG nova.network.neutron [-] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.448 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[318f3461-b457-4d61-a455-d7ef55ef489c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:15 np0005538960 nova_compute[187252]: 2025-11-28 16:20:15.454 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.463 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a743a6b8-ea5c-402d-8a5b-4641d8681202]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.465 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[96eaf8c1-c327-426f-a271-1e34dcb10816]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.482 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6efda5b7-8487-4034-954b-1db5e0e4f952]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 389405, 'reachable_time': 31596, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215390, 'error': None, 'target': 'ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.485 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3f0dcdca-a602-45f0-90ca-94c068cfb9fe deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:20:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:15.485 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[d68d47ef-58c3-481c-ac05-717738b26497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:15 np0005538960 systemd[1]: run-netns-ovnmeta\x2d3f0dcdca\x2da602\x2d45f0\x2d90ca\x2d94c068cfb9fe.mount: Deactivated successfully.
Nov 28 11:20:16 np0005538960 nova_compute[187252]: 2025-11-28 16:20:16.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.321 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.321 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.321 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.338 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.415 187256 DEBUG nova.network.neutron [-] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.420 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.489 187256 INFO nova.compute.manager [-] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Took 2.04 seconds to deallocate network for instance.#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.549 187256 DEBUG oslo_concurrency.lockutils [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.550 187256 DEBUG oslo_concurrency.lockutils [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.588 187256 DEBUG nova.compute.manager [req-07f0caa5-6b43-457a-9075-b05f929a579d req-b3d97ee6-f7da-4266-9c83-7517c9254549 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received event network-vif-deleted-9d1a652f-be5e-4b1b-b759-f17b3181308b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.627 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.627 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.627 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.628 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid eff28834-4c5b-46d0-90a8-4be63b9fff80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.643 187256 DEBUG nova.compute.provider_tree [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.658 187256 DEBUG nova.scheduler.client.report [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.682 187256 DEBUG oslo_concurrency.lockutils [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.714 187256 INFO nova.scheduler.client.report [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Deleted allocations for instance 8b80b6d9-b521-40f5-be13-ef8f7196f818#033[00m
Nov 28 11:20:17 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:17Z|00053|binding|INFO|Releasing lport 8da911b6-4b06-444b-b895-eebe136e2189 from this chassis (sb_readonly=0)
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.824 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:17 np0005538960 nova_compute[187252]: 2025-11-28 16:20:17.831 187256 DEBUG oslo_concurrency.lockutils [None req-e03937af-e3af-44e6-9020-89a2910fb199 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.787s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:18 np0005538960 podman[215391]: 2025-11-28 16:20:18.163339253 +0000 UTC m=+0.060458125 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.347 187256 DEBUG nova.network.neutron [req-8a7e2d2a-299f-45fc-95cc-61ded26d52a7 req-4c2e2c6f-b644-44cd-b622-10ae37c6d205 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Updated VIF entry in instance network info cache for port 9d1a652f-be5e-4b1b-b759-f17b3181308b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.348 187256 DEBUG nova.network.neutron [req-8a7e2d2a-299f-45fc-95cc-61ded26d52a7 req-4c2e2c6f-b644-44cd-b622-10ae37c6d205 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Updating instance_info_cache with network_info: [{"id": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "address": "fa:16:3e:f3:98:56", "network": {"id": "3f0dcdca-a602-45f0-90ca-94c068cfb9fe", "bridge": "br-int", "label": "tempest-network-smoke--1328571052", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d1a652f-be", "ovs_interfaceid": "9d1a652f-be5e-4b1b-b759-f17b3181308b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.370 187256 DEBUG oslo_concurrency.lockutils [req-8a7e2d2a-299f-45fc-95cc-61ded26d52a7 req-4c2e2c6f-b644-44cd-b622-10ae37c6d205 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-8b80b6d9-b521-40f5-be13-ef8f7196f818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.432 187256 DEBUG nova.compute.manager [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received event network-vif-unplugged-9d1a652f-be5e-4b1b-b759-f17b3181308b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.433 187256 DEBUG oslo_concurrency.lockutils [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.433 187256 DEBUG oslo_concurrency.lockutils [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.433 187256 DEBUG oslo_concurrency.lockutils [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.433 187256 DEBUG nova.compute.manager [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] No waiting events found dispatching network-vif-unplugged-9d1a652f-be5e-4b1b-b759-f17b3181308b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.433 187256 WARNING nova.compute.manager [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received unexpected event network-vif-unplugged-9d1a652f-be5e-4b1b-b759-f17b3181308b for instance with vm_state deleted and task_state None.#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.434 187256 DEBUG nova.compute.manager [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received event network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.434 187256 DEBUG oslo_concurrency.lockutils [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.434 187256 DEBUG oslo_concurrency.lockutils [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.434 187256 DEBUG oslo_concurrency.lockutils [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8b80b6d9-b521-40f5-be13-ef8f7196f818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.434 187256 DEBUG nova.compute.manager [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] No waiting events found dispatching network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:20:18 np0005538960 nova_compute[187252]: 2025-11-28 16:20:18.434 187256 WARNING nova.compute.manager [req-f966ac89-2654-47c3-b0b3-6173cbaaf26f req-7723d596-3898-40db-83df-dd37384f6e5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Received unexpected event network-vif-plugged-9d1a652f-be5e-4b1b-b759-f17b3181308b for instance with vm_state deleted and task_state None.#033[00m
Nov 28 11:20:20 np0005538960 nova_compute[187252]: 2025-11-28 16:20:20.347 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:22 np0005538960 nova_compute[187252]: 2025-11-28 16:20:22.406 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updating instance_info_cache with network_info: [{"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:20:22 np0005538960 nova_compute[187252]: 2025-11-28 16:20:22.421 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:20:22 np0005538960 nova_compute[187252]: 2025-11-28 16:20:22.421 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 11:20:22 np0005538960 nova_compute[187252]: 2025-11-28 16:20:22.421 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:20:22 np0005538960 nova_compute[187252]: 2025-11-28 16:20:22.422 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:20:22 np0005538960 nova_compute[187252]: 2025-11-28 16:20:22.423 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:23 np0005538960 nova_compute[187252]: 2025-11-28 16:20:23.413 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:20:24 np0005538960 podman[215415]: 2025-11-28 16:20:24.215311859 +0000 UTC m=+0.112131754 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 11:20:25 np0005538960 nova_compute[187252]: 2025-11-28 16:20:25.351 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:27 np0005538960 nova_compute[187252]: 2025-11-28 16:20:27.428 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:28 np0005538960 podman[215443]: 2025-11-28 16:20:28.161192982 +0000 UTC m=+0.060268380 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 11:20:28 np0005538960 podman[215444]: 2025-11-28 16:20:28.180031548 +0000 UTC m=+0.076808450 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 11:20:30 np0005538960 nova_compute[187252]: 2025-11-28 16:20:30.324 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764346815.3231828, 8b80b6d9-b521-40f5-be13-ef8f7196f818 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:20:30 np0005538960 nova_compute[187252]: 2025-11-28 16:20:30.325 187256 INFO nova.compute.manager [-] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:20:30 np0005538960 nova_compute[187252]: 2025-11-28 16:20:30.349 187256 DEBUG nova.compute.manager [None req-d2b14c80-9335-4db4-adeb-6ffd7d79cfe6 - - - - - -] [instance: 8b80b6d9-b521-40f5-be13-ef8f7196f818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:20:30 np0005538960 nova_compute[187252]: 2025-11-28 16:20:30.354 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:32 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:32Z|00054|binding|INFO|Releasing lport 8da911b6-4b06-444b-b895-eebe136e2189 from this chassis (sb_readonly=0)
Nov 28 11:20:32 np0005538960 nova_compute[187252]: 2025-11-28 16:20:32.146 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:32 np0005538960 nova_compute[187252]: 2025-11-28 16:20:32.268 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:32 np0005538960 nova_compute[187252]: 2025-11-28 16:20:32.431 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:34 np0005538960 podman[215482]: 2025-11-28 16:20:34.155921444 +0000 UTC m=+0.055431503 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:20:35 np0005538960 nova_compute[187252]: 2025-11-28 16:20:35.357 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:35.801 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}8f3bced7c182836bc7f681b45915233ccdc90c73eccbd27ac1996f2cbe22b8e8" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 11:20:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:35.932 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Fri, 28 Nov 2025 16:20:35 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-31afbfc9-f873-4a26-81ea-e161f7ff07e1 x-openstack-request-id: req-31afbfc9-f873-4a26-81ea-e161f7ff07e1 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 11:20:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:35.933 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "60d3f730-7668-4a83-b596-bc00400d7294", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/60d3f730-7668-4a83-b596-bc00400d7294"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/60d3f730-7668-4a83-b596-bc00400d7294"}]}, {"id": "c90217bd-1e89-4c68-8e01-33bf1cee456c", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c90217bd-1e89-4c68-8e01-33bf1cee456c"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c90217bd-1e89-4c68-8e01-33bf1cee456c"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 11:20:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:35.933 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-31afbfc9-f873-4a26-81ea-e161f7ff07e1 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 11:20:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:35.936 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/c90217bd-1e89-4c68-8e01-33bf1cee456c -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}8f3bced7c182836bc7f681b45915233ccdc90c73eccbd27ac1996f2cbe22b8e8" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Nov 28 11:20:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:35.978 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Fri, 28 Nov 2025 16:20:35 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-b0bbab16-4cae-47f8-9dfb-5c0ea5016b9a x-openstack-request-id: req-b0bbab16-4cae-47f8-9dfb-5c0ea5016b9a _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Nov 28 11:20:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:35.978 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "c90217bd-1e89-4c68-8e01-33bf1cee456c", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/c90217bd-1e89-4c68-8e01-33bf1cee456c"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/c90217bd-1e89-4c68-8e01-33bf1cee456c"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Nov 28 11:20:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:35.978 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/c90217bd-1e89-4c68-8e01-33bf1cee456c used request id req-b0bbab16-4cae-47f8-9dfb-5c0ea5016b9a request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Nov 28 11:20:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:35.979 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'name': 'tempest-TestNetworkBasicOps-server-5027134', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'user_id': 'a4105532118847f583e4bf7594336693', 'hostId': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 11:20:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:35.980 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.009 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.read.latency volume: 258629787 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.010 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.read.latency volume: 25574080 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3282bad8-10cc-4875-8cfd-bea9fa056350', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 258629787, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-vda', 'timestamp': '2025-11-28T16:20:35.980178', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2bb6e7ce-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': '2c1fa2a4174b3c847934c9a11120c5cc359865ab6f7cd4b58c4571a8e77310de'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 25574080, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 
'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-sda', 'timestamp': '2025-11-28T16:20:35.980178', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2bb6f624-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': '2459e0fe7e258611d6857ddfade9cf39e75518acc6742ec033625edfaea31bc5'}]}, 'timestamp': '2025-11-28 16:20:36.010486', '_unique_id': 'dac0fccad2ca4e79a6877753b86b0b2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.017 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.020 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.025 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for eff28834-4c5b-46d0-90a8-4be63b9fff80 / tap6b498512-32 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.025 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec01b6ab-2a78-436c-ba64-07a4aa110254', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000003-eff28834-4c5b-46d0-90a8-4be63b9fff80-tap6b498512-32', 'timestamp': '2025-11-28T16:20:36.021046', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'tap6b498512-32', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f3:00:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b498512-32'}, 'message_id': '2bb94e38-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.655364951, 'message_signature': '857e2dad8a3aa7aa163a03557c235fc84bf6fb5d3d4c8d025cf46002e5afdb1b'}]}, 'timestamp': '2025-11-28 16:20:36.026026', '_unique_id': '47da0ed25167470b8ecad995725b1ae9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.027 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.028 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.028 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-5027134>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-5027134>]
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.028 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.029 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.write.bytes volume: 73138176 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.029 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f659739-ab63-4660-95e3-c5874d1a69eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73138176, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-vda', 'timestamp': '2025-11-28T16:20:36.029028', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2bb9d7cc-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': '2015ba2f1429ab64a60059c56f97b8aab78d51113aef596f5febb0d18e59c547'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-sda', 'timestamp': '2025-11-28T16:20:36.029028', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2bb9e12c-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': '45e1dcfad7bf0b117c1284dcd3e02c3b615c4da4f72125c87309f82269f36a1d'}]}, 'timestamp': '2025-11-28 16:20:36.029579', '_unique_id': 'dcb25a9528144fcdbe40dc9a56505c2d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.030 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/network.outgoing.packets volume: 228 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e34b1b4e-b4cc-4759-bada-6ed00d84abf2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 228, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000003-eff28834-4c5b-46d0-90a8-4be63b9fff80-tap6b498512-32', 'timestamp': '2025-11-28T16:20:36.031123', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'tap6b498512-32', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f3:00:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b498512-32'}, 'message_id': '2bba26fa-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.655364951, 'message_signature': '213f839a0ca30d503f498839cdcc6c6c395f0bb14027741a4e455d89f7491f2b'}]}, 'timestamp': '2025-11-28 16:20:36.031413', '_unique_id': '65873f09b55b40dda91288ed0fd18524'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.031 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.033 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.033 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.read.bytes volume: 30398976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.033 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f93b0b9-43c3-4e5d-970e-5d64414be228', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30398976, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-vda', 'timestamp': '2025-11-28T16:20:36.033094', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2bba751a-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': 'd9c755978c8b9cdbba7679284db3a44ccd803d2430f052e370c6ae05883f36ff'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 
'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-sda', 'timestamp': '2025-11-28T16:20:36.033094', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2bba7e0c-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': '12cfe7b47030fd2c3da67e2368b384ce5cc74aa6cf51930e9f48864f337ce5f5'}]}, 'timestamp': '2025-11-28 16:20:36.033589', '_unique_id': 'a2e23729dd9945c1b453ec0c19e1aa70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.034 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/network.incoming.bytes volume: 39155 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ba84b17-ef26-4288-93a4-7ce82ac0adaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 39155, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000003-eff28834-4c5b-46d0-90a8-4be63b9fff80-tap6b498512-32', 'timestamp': '2025-11-28T16:20:36.035025', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'tap6b498512-32', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f3:00:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b498512-32'}, 'message_id': '2bbabfb6-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.655364951, 'message_signature': '2e688cd66aa6b206b8689430d67450e7a959683415efa17c11c6a3ed1b221e02'}]}, 'timestamp': '2025-11-28 16:20:36.035300', '_unique_id': '28c1237e96b94144a020020256c911f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.035 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.036 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.036 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e2b18083-e12d-4254-a597-ce57af4406f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000003-eff28834-4c5b-46d0-90a8-4be63b9fff80-tap6b498512-32', 'timestamp': '2025-11-28T16:20:36.036639', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'tap6b498512-32', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f3:00:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b498512-32'}, 'message_id': '2bbaff6c-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.655364951, 'message_signature': 'a1445cc00fda96fc5422c0cb523c7731cdd195ca7583b7a7b43421243f47a64c'}]}, 'timestamp': '2025-11-28 16:20:36.036950', '_unique_id': '4b82d7ee521143cb9feee448b3bb9df7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.037 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.038 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.038 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.write.latency volume: 4472769002 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.038 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5e8b862e-dc4e-4b0d-b59b-390ef44a7830', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4472769002, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-vda', 'timestamp': '2025-11-28T16:20:36.038098', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2bbb36c6-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': '70af73cbef5dba445c29548e1eca0153765c304c15b0843c6a32cd3fe0ed2952'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-sda', 'timestamp': '2025-11-28T16:20:36.038098', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2bbb3ef0-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': '27295d55465ec54ca0a1504b515f2ee3c183dc896d469a6c30b52a253ff02189'}]}, 'timestamp': '2025-11-28 16:20:36.038523', '_unique_id': '41afb4be4233476084e4f95a2d245713'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.039 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6347e80-06a2-469e-8a4c-5fce0c8284ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000003-eff28834-4c5b-46d0-90a8-4be63b9fff80-tap6b498512-32', 'timestamp': '2025-11-28T16:20:36.039708', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'tap6b498512-32', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f3:00:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b498512-32'}, 'message_id': '2bbb7618-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.655364951, 'message_signature': 'ff79771d5c90139f98f05cc8013d4f0812e82ef8223eb19af9c757073bf7a10f'}]}, 'timestamp': '2025-11-28 16:20:36.039971', '_unique_id': 'b9faaa7a04d74ed0a3b1b915fb5b184b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.040 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/network.incoming.packets volume: 204 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb856738-16a0-4851-9251-1b968cf10a44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 204, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000003-eff28834-4c5b-46d0-90a8-4be63b9fff80-tap6b498512-32', 'timestamp': '2025-11-28T16:20:36.041045', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'tap6b498512-32', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f3:00:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b498512-32'}, 'message_id': '2bbba9ee-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.655364951, 'message_signature': 'db6822e9b9ebc74e49c305def74dc76f04baf4fe0e998f40b9202821cf0a679a'}]}, 'timestamp': '2025-11-28 16:20:36.041271', '_unique_id': '15b6b3c6619a4ab6be46fa1ddb84bed3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.041 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.042 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.042 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.read.requests volume: 1091 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.042 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2aa403b0-ff32-4b7e-9ea2-ca04a96a69db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1091, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-vda', 'timestamp': '2025-11-28T16:20:36.042366', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2bbbdee6-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': 'f4770b731201ffc74f8c849912104d5ea81d14164a93cbfaef1993d9230f08ef'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-sda', 'timestamp': '2025-11-28T16:20:36.042366', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2bbbea08-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': '7a8ec89ab7bb5e1479a9ca996560f1d92e32b7de04c979d0f04646fcaab8c2e1'}]}, 'timestamp': '2025-11-28 16:20:36.042974', '_unique_id': 'bf7a9cf7b0af427eb434f1c456015277'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.043 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.044 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.056 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.056 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '395f72bb-e723-4cd0-9145-12983d62366a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-vda', 'timestamp': '2025-11-28T16:20:36.044263', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2bbe01bc-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.678568792, 'message_signature': '5bb0a63bc3482dfa774202b55af343b7d03cee2e1cbeb5073ea2a553e99020f9'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-sda', 'timestamp': '2025-11-28T16:20:36.044263', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2bbe1378-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.678568792, 'message_signature': 'e29974984c1f921fbd3d7cdade3d18bf31d1a21c98f99722c740269d770306e1'}]}, 'timestamp': '2025-11-28 16:20:36.057084', '_unique_id': '860bb55f360743848df1b3c957b27eea'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.058 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '103393dc-6a00-4bcd-9d76-bb843359824f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-vda', 'timestamp': '2025-11-28T16:20:36.058775', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2bbe5e96-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.678568792, 'message_signature': 'd230ef74b7a7c4f5602f280058432686fea350c0a68ece7301894b0ca4c15804'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-sda', 'timestamp': '2025-11-28T16:20:36.058775', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2bbe67b0-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.678568792, 'message_signature': 'b62fb6dd491eb2067ef8706c7d3a22654a671ecef0069d88d94c6a79cc409d3c'}]}, 'timestamp': '2025-11-28 16:20:36.059222', '_unique_id': 'b806aeedefe8492face8ef682b405a92'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.059 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.060 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.060 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ebb0300-4bfb-4102-86b7-4fe047c523c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000003-eff28834-4c5b-46d0-90a8-4be63b9fff80-tap6b498512-32', 'timestamp': '2025-11-28T16:20:36.060335', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'tap6b498512-32', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f3:00:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b498512-32'}, 'message_id': '2bbe9bcc-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.655364951, 'message_signature': 'b128c8cc7073f6b915901ae4e5d5b37e03910ff0d59f1f354ed4f143a7395142'}]}, 'timestamp': '2025-11-28 16:20:36.060570', '_unique_id': '1b29cf865e4c4396a948cf931073c512'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.write.requests volume: 341 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.061 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48756cc1-9417-4c48-bc41-4f20303fafa4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 341, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-vda', 'timestamp': '2025-11-28T16:20:36.061631', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2bbece26-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': '88702b4dc3cbde9763358394b30906d980a970b1fcd96dc2e89e60cdc513f1e2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-sda', 'timestamp': '2025-11-28T16:20:36.061631', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2bbed6e6-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.614430481, 'message_signature': '3dc6736c98c6eb08256b03034444ca07b5915de9ff84bdeec42efe6b87198dcd'}]}, 'timestamp': '2025-11-28 16:20:36.062069', '_unique_id': 'c313e34798f94853b26ad4daa1534687'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.062 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/network.outgoing.bytes volume: 34038 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '736fc346-34e4-4950-9205-b28fb9a68927', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 34038, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000003-eff28834-4c5b-46d0-90a8-4be63b9fff80-tap6b498512-32', 'timestamp': '2025-11-28T16:20:36.063134', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'tap6b498512-32', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f3:00:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b498512-32'}, 'message_id': '2bbf08c8-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.655364951, 'message_signature': '145130a3313fd41bfec956dd4f90cd1d8ff7c7306a689d790263cc9bb2ade6d8'}]}, 'timestamp': '2025-11-28 16:20:36.063361', '_unique_id': '998c843c17794b599a93280eb671f3a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.063 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.064 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.064 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.064 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-5027134>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-5027134>]
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.064 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.080 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/cpu volume: 13400000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '147789d9-d6f7-4ab9-bd96-164c68233051', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 13400000000, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'timestamp': '2025-11-28T16:20:36.064733', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '2bc1b352-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.714180275, 'message_signature': '9dfc25df635b93963aa5a4b337259f768256a0958f3ad600de4b25a0231a0648'}]}, 'timestamp': '2025-11-28 16:20:36.080933', '_unique_id': 'd3bcbcfe4f5d4d8fa437c550781ba207'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.081 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.082 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.082 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.082 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-5027134>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-5027134>]
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.083 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.083 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.083 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5b83c7c-009d-46a4-863c-89fa6bb2ff26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-vda', 'timestamp': '2025-11-28T16:20:36.083075', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2bc21428-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.678568792, 'message_signature': '00d1e8bf68750871625b358a3248dc558348fc6430be9ac6d1360cf64c851b62'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80-sda', 'timestamp': '2025-11-28T16:20:36.083075', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2bc21c34-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.678568792, 'message_signature': '5cda271d76948ef76b8f0fad983efe5c2d03d5a311322d78fe64a85f2b3a708e'}]}, 'timestamp': '2025-11-28 16:20:36.083572', '_unique_id': '0fa2d8dac8e14dda91c3a2a0916912dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.084 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '35117b44-eaea-411e-93eb-9b22fdca4126', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000003-eff28834-4c5b-46d0-90a8-4be63b9fff80-tap6b498512-32', 'timestamp': '2025-11-28T16:20:36.084744', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'tap6b498512-32', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f3:00:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b498512-32'}, 'message_id': '2bc2555a-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.655364951, 'message_signature': '20b84c182da2fc09cfb8c5f208e47d5dde0d57cbcff91715e900eacbd946583f'}]}, 'timestamp': '2025-11-28 16:20:36.085012', '_unique_id': 'd09add25c7814635b016c0289d89399e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.085 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.086 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.086 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b783a321-db12-4b02-8191-6514bd751be8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000003-eff28834-4c5b-46d0-90a8-4be63b9fff80-tap6b498512-32', 'timestamp': '2025-11-28T16:20:36.086266', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'tap6b498512-32', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:f3:00:db', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b498512-32'}, 'message_id': '2bc29088-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.655364951, 'message_signature': '87721ae2b4fae4483fdc688e862a35a0bd47dd699977dbbf21a070b7befd86b5'}]}, 'timestamp': '2025-11-28 16:20:36.086500', '_unique_id': '5a9e9f6a62634a6c817227f17238b398'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.087 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.088 12 DEBUG ceilometer.compute.pollsters [-] eff28834-4c5b-46d0-90a8-4be63b9fff80/memory.usage volume: 42.85546875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb3d9b98-b247-4387-8e66-35f16faac2b8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.85546875, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'timestamp': '2025-11-28T16:20:36.088006', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-5027134', 'name': 'instance-00000003', 'instance_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '2bc2d6f6-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 3935.714180275, 'message_signature': '7d02017c42b21892c6a2b8b79e802ce6a6ce15d1554a44556ae8be67f406203f'}]}, 'timestamp': '2025-11-28 16:20:36.088555', '_unique_id': 'ee1a15e229d34c3db4499a49cc50e5c0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.089 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.090 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.090 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:20:36 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:20:36.090 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-5027134>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-5027134>]
Nov 28 11:20:37 np0005538960 podman[215506]: 2025-11-28 16:20:37.166997716 +0000 UTC m=+0.068798546 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Nov 28 11:20:37 np0005538960 nova_compute[187252]: 2025-11-28 16:20:37.433 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.396 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.863 187256 DEBUG oslo_concurrency.lockutils [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "eff28834-4c5b-46d0-90a8-4be63b9fff80" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.864 187256 DEBUG oslo_concurrency.lockutils [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.865 187256 DEBUG oslo_concurrency.lockutils [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.865 187256 DEBUG oslo_concurrency.lockutils [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.866 187256 DEBUG oslo_concurrency.lockutils [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.868 187256 INFO nova.compute.manager [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Terminating instance#033[00m
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.869 187256 DEBUG nova.compute.manager [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:20:40 np0005538960 kernel: tap6b498512-32 (unregistering): left promiscuous mode
Nov 28 11:20:40 np0005538960 NetworkManager[55548]: <info>  [1764346840.8957] device (tap6b498512-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:20:40 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:40Z|00055|binding|INFO|Releasing lport 6b498512-32dd-4e59-95bd-71c3a69bb44f from this chassis (sb_readonly=0)
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.900 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:40 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:40Z|00056|binding|INFO|Setting lport 6b498512-32dd-4e59-95bd-71c3a69bb44f down in Southbound
Nov 28 11:20:40 np0005538960 ovn_controller[95460]: 2025-11-28T16:20:40Z|00057|binding|INFO|Removing iface tap6b498512-32 ovn-installed in OVS
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.905 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:40 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:40.915 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:00:db 10.100.0.14'], port_security=['fa:16:3e:f3:00:db 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'eff28834-4c5b-46d0-90a8-4be63b9fff80', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e779c78f-4948-46c1-a91a-4b1068ceaae1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '4', 'neutron:security_group_ids': '870cc982-353e-41b1-a555-537b285e8e4c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fb853e10-1f64-4b13-bf92-c660af7671ba, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=6b498512-32dd-4e59-95bd-71c3a69bb44f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:20:40 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:40.918 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 6b498512-32dd-4e59-95bd-71c3a69bb44f in datapath e779c78f-4948-46c1-a91a-4b1068ceaae1 unbound from our chassis#033[00m
Nov 28 11:20:40 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:40.921 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e779c78f-4948-46c1-a91a-4b1068ceaae1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:20:40 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:40.922 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1dcb261a-4fc6-48a5-8a3f-531032864c97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:40 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:40.923 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1 namespace which is not needed anymore#033[00m
Nov 28 11:20:40 np0005538960 nova_compute[187252]: 2025-11-28 16:20:40.937 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:40 np0005538960 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Nov 28 11:20:40 np0005538960 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 18.630s CPU time.
Nov 28 11:20:40 np0005538960 systemd-machined[153518]: Machine qemu-1-instance-00000003 terminated.
Nov 28 11:20:41 np0005538960 neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1[214379]: [NOTICE]   (214383) : haproxy version is 2.8.14-c23fe91
Nov 28 11:20:41 np0005538960 neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1[214379]: [NOTICE]   (214383) : path to executable is /usr/sbin/haproxy
Nov 28 11:20:41 np0005538960 neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1[214379]: [WARNING]  (214383) : Exiting Master process...
Nov 28 11:20:41 np0005538960 neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1[214379]: [ALERT]    (214383) : Current worker (214385) exited with code 143 (Terminated)
Nov 28 11:20:41 np0005538960 neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1[214379]: [WARNING]  (214383) : All workers exited. Exiting... (0)
Nov 28 11:20:41 np0005538960 systemd[1]: libpod-383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9.scope: Deactivated successfully.
Nov 28 11:20:41 np0005538960 podman[215551]: 2025-11-28 16:20:41.087434458 +0000 UTC m=+0.049385754 container died 383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0)
Nov 28 11:20:41 np0005538960 kernel: tap6b498512-32: entered promiscuous mode
Nov 28 11:20:41 np0005538960 kernel: tap6b498512-32 (unregistering): left promiscuous mode
Nov 28 11:20:41 np0005538960 NetworkManager[55548]: <info>  [1764346841.0923] manager: (tap6b498512-32): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.102 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:41 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9-userdata-shm.mount: Deactivated successfully.
Nov 28 11:20:41 np0005538960 systemd[1]: var-lib-containers-storage-overlay-1af84f5093ee7d3f1aa174f428b166794890ba043550a3c1d03954af8115883b-merged.mount: Deactivated successfully.
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.142 187256 INFO nova.virt.libvirt.driver [-] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Instance destroyed successfully.#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.144 187256 DEBUG nova.objects.instance [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'resources' on Instance uuid eff28834-4c5b-46d0-90a8-4be63b9fff80 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:20:41 np0005538960 podman[215551]: 2025-11-28 16:20:41.147326237 +0000 UTC m=+0.109277533 container cleanup 383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:20:41 np0005538960 systemd[1]: libpod-conmon-383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9.scope: Deactivated successfully.
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.169 187256 DEBUG nova.virt.libvirt.vif [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:18:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-5027134',display_name='tempest-TestNetworkBasicOps-server-5027134',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-5027134',id=3,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE8qss00IgI7JJJCpwpUtOPNZOKd3VbUa+RvJPQz0L1gQHQ/A/taf962uh+XblnyqL6/863JdV1hTqTrPBvdsLWy2S9tfii7CzAhBQLzdbaC8IXYKznxQHwcqbW6UCp+cA==',key_name='tempest-TestNetworkBasicOps-197299468',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:18:52Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-ngyv601g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:18:52Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=eff28834-4c5b-46d0-90a8-4be63b9fff80,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.169 187256 DEBUG nova.network.os_vif_util [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.203", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.170 187256 DEBUG nova.network.os_vif_util [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f3:00:db,bridge_name='br-int',has_traffic_filtering=True,id=6b498512-32dd-4e59-95bd-71c3a69bb44f,network=Network(e779c78f-4948-46c1-a91a-4b1068ceaae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b498512-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.170 187256 DEBUG os_vif [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:00:db,bridge_name='br-int',has_traffic_filtering=True,id=6b498512-32dd-4e59-95bd-71c3a69bb44f,network=Network(e779c78f-4948-46c1-a91a-4b1068ceaae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b498512-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.172 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.172 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b498512-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.177 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.180 187256 INFO os_vif [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f3:00:db,bridge_name='br-int',has_traffic_filtering=True,id=6b498512-32dd-4e59-95bd-71c3a69bb44f,network=Network(e779c78f-4948-46c1-a91a-4b1068ceaae1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b498512-32')#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.180 187256 INFO nova.virt.libvirt.driver [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Deleting instance files /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80_del#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.181 187256 INFO nova.virt.libvirt.driver [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Deletion of /var/lib/nova/instances/eff28834-4c5b-46d0-90a8-4be63b9fff80_del complete#033[00m
Nov 28 11:20:41 np0005538960 podman[215598]: 2025-11-28 16:20:41.23042027 +0000 UTC m=+0.056372184 container remove 383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 11:20:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:41.238 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[549e112c-263c-4f7d-a78f-70e83a2b694c]: (4, ('Fri Nov 28 04:20:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1 (383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9)\n383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9\nFri Nov 28 04:20:41 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1 (383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9)\n383f05fec8cfd3b87b9afe17ea27bdb6823baf4f0e0df1d759f243a4ffc294c9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:41.240 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[7c94615f-7d04-42a3-8151-14f87feba0bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:41.241 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape779c78f-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:20:41 np0005538960 kernel: tape779c78f-40: left promiscuous mode
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.243 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:41.249 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[553dccf2-da99-49c4-8949-2f43166443ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.260 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:41.281 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[eb715f5f-f06d-4183-b001-0441e45da763]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:41.283 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6122d9e8-71e7-4a5c-8247-5ecbb168c82f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:41.299 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f0c9a4e9-bce9-4509-915f-0d4fac84b360]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382560, 'reachable_time': 24820, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215613, 'error': None, 'target': 'ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:41 np0005538960 systemd[1]: run-netns-ovnmeta\x2de779c78f\x2d4948\x2d46c1\x2da91a\x2d4b1068ceaae1.mount: Deactivated successfully.
Nov 28 11:20:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:41.303 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e779c78f-4948-46c1-a91a-4b1068ceaae1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:20:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:41.304 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[50ddea88-d965-4aa5-b017-6dfd47d480ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.968 187256 INFO nova.compute.manager [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Took 1.10 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.969 187256 DEBUG oslo.service.loopingcall [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.970 187256 DEBUG nova.compute.manager [-] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:20:41 np0005538960 nova_compute[187252]: 2025-11-28 16:20:41.970 187256 DEBUG nova.network.neutron [-] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.416 187256 DEBUG nova.compute.manager [req-71747a9d-5361-4ab5-8a1a-9b21a4a145ca req-586d531e-b44c-4804-9aba-8f43cdd5fd87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received event network-vif-unplugged-6b498512-32dd-4e59-95bd-71c3a69bb44f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.417 187256 DEBUG oslo_concurrency.lockutils [req-71747a9d-5361-4ab5-8a1a-9b21a4a145ca req-586d531e-b44c-4804-9aba-8f43cdd5fd87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.417 187256 DEBUG oslo_concurrency.lockutils [req-71747a9d-5361-4ab5-8a1a-9b21a4a145ca req-586d531e-b44c-4804-9aba-8f43cdd5fd87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.417 187256 DEBUG oslo_concurrency.lockutils [req-71747a9d-5361-4ab5-8a1a-9b21a4a145ca req-586d531e-b44c-4804-9aba-8f43cdd5fd87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.417 187256 DEBUG nova.compute.manager [req-71747a9d-5361-4ab5-8a1a-9b21a4a145ca req-586d531e-b44c-4804-9aba-8f43cdd5fd87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] No waiting events found dispatching network-vif-unplugged-6b498512-32dd-4e59-95bd-71c3a69bb44f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.417 187256 DEBUG nova.compute.manager [req-71747a9d-5361-4ab5-8a1a-9b21a4a145ca req-586d531e-b44c-4804-9aba-8f43cdd5fd87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received event network-vif-unplugged-6b498512-32dd-4e59-95bd-71c3a69bb44f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.435 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.516 187256 DEBUG nova.compute.manager [req-a843a87d-3a74-457e-88ee-bab6cba8fba7 req-2332a2c3-01b4-40fc-8f1b-eaf276935acf 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received event network-changed-6b498512-32dd-4e59-95bd-71c3a69bb44f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.517 187256 DEBUG nova.compute.manager [req-a843a87d-3a74-457e-88ee-bab6cba8fba7 req-2332a2c3-01b4-40fc-8f1b-eaf276935acf 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Refreshing instance network info cache due to event network-changed-6b498512-32dd-4e59-95bd-71c3a69bb44f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.517 187256 DEBUG oslo_concurrency.lockutils [req-a843a87d-3a74-457e-88ee-bab6cba8fba7 req-2332a2c3-01b4-40fc-8f1b-eaf276935acf 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.518 187256 DEBUG oslo_concurrency.lockutils [req-a843a87d-3a74-457e-88ee-bab6cba8fba7 req-2332a2c3-01b4-40fc-8f1b-eaf276935acf 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:20:42 np0005538960 nova_compute[187252]: 2025-11-28 16:20:42.518 187256 DEBUG nova.network.neutron [req-a843a87d-3a74-457e-88ee-bab6cba8fba7 req-2332a2c3-01b4-40fc-8f1b-eaf276935acf 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Refreshing network info cache for port 6b498512-32dd-4e59-95bd-71c3a69bb44f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:20:44 np0005538960 nova_compute[187252]: 2025-11-28 16:20:44.974 187256 DEBUG nova.compute.manager [req-8432dca7-32c6-4622-8e25-82584066ab1c req-bfe464de-179c-415e-93ff-6c3c51c052b8 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received event network-vif-plugged-6b498512-32dd-4e59-95bd-71c3a69bb44f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:20:44 np0005538960 nova_compute[187252]: 2025-11-28 16:20:44.975 187256 DEBUG oslo_concurrency.lockutils [req-8432dca7-32c6-4622-8e25-82584066ab1c req-bfe464de-179c-415e-93ff-6c3c51c052b8 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:44 np0005538960 nova_compute[187252]: 2025-11-28 16:20:44.975 187256 DEBUG oslo_concurrency.lockutils [req-8432dca7-32c6-4622-8e25-82584066ab1c req-bfe464de-179c-415e-93ff-6c3c51c052b8 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:44 np0005538960 nova_compute[187252]: 2025-11-28 16:20:44.976 187256 DEBUG oslo_concurrency.lockutils [req-8432dca7-32c6-4622-8e25-82584066ab1c req-bfe464de-179c-415e-93ff-6c3c51c052b8 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:44 np0005538960 nova_compute[187252]: 2025-11-28 16:20:44.976 187256 DEBUG nova.compute.manager [req-8432dca7-32c6-4622-8e25-82584066ab1c req-bfe464de-179c-415e-93ff-6c3c51c052b8 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] No waiting events found dispatching network-vif-plugged-6b498512-32dd-4e59-95bd-71c3a69bb44f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:20:44 np0005538960 nova_compute[187252]: 2025-11-28 16:20:44.976 187256 WARNING nova.compute.manager [req-8432dca7-32c6-4622-8e25-82584066ab1c req-bfe464de-179c-415e-93ff-6c3c51c052b8 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received unexpected event network-vif-plugged-6b498512-32dd-4e59-95bd-71c3a69bb44f for instance with vm_state active and task_state deleting.#033[00m
Nov 28 11:20:45 np0005538960 nova_compute[187252]: 2025-11-28 16:20:45.327 187256 DEBUG nova.network.neutron [-] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:20:45 np0005538960 nova_compute[187252]: 2025-11-28 16:20:45.349 187256 INFO nova.compute.manager [-] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Took 3.38 seconds to deallocate network for instance.#033[00m
Nov 28 11:20:45 np0005538960 nova_compute[187252]: 2025-11-28 16:20:45.470 187256 DEBUG oslo_concurrency.lockutils [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:20:45 np0005538960 nova_compute[187252]: 2025-11-28 16:20:45.471 187256 DEBUG oslo_concurrency.lockutils [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:20:45 np0005538960 nova_compute[187252]: 2025-11-28 16:20:45.564 187256 DEBUG nova.compute.provider_tree [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:20:45 np0005538960 nova_compute[187252]: 2025-11-28 16:20:45.586 187256 DEBUG nova.scheduler.client.report [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:20:45 np0005538960 nova_compute[187252]: 2025-11-28 16:20:45.620 187256 DEBUG oslo_concurrency.lockutils [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:45 np0005538960 nova_compute[187252]: 2025-11-28 16:20:45.651 187256 INFO nova.scheduler.client.report [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Deleted allocations for instance eff28834-4c5b-46d0-90a8-4be63b9fff80#033[00m
Nov 28 11:20:45 np0005538960 nova_compute[187252]: 2025-11-28 16:20:45.733 187256 DEBUG oslo_concurrency.lockutils [None req-2f2151ce-e3e6-401a-9c91-63868fb6ca73 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "eff28834-4c5b-46d0-90a8-4be63b9fff80" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.869s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:20:45 np0005538960 podman[215618]: 2025-11-28 16:20:45.857951274 +0000 UTC m=+0.068017177 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 28 11:20:46 np0005538960 nova_compute[187252]: 2025-11-28 16:20:46.178 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:47 np0005538960 nova_compute[187252]: 2025-11-28 16:20:47.070 187256 DEBUG nova.network.neutron [req-a843a87d-3a74-457e-88ee-bab6cba8fba7 req-2332a2c3-01b4-40fc-8f1b-eaf276935acf 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updated VIF entry in instance network info cache for port 6b498512-32dd-4e59-95bd-71c3a69bb44f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:20:47 np0005538960 nova_compute[187252]: 2025-11-28 16:20:47.070 187256 DEBUG nova.network.neutron [req-a843a87d-3a74-457e-88ee-bab6cba8fba7 req-2332a2c3-01b4-40fc-8f1b-eaf276935acf 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Updating instance_info_cache with network_info: [{"id": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "address": "fa:16:3e:f3:00:db", "network": {"id": "e779c78f-4948-46c1-a91a-4b1068ceaae1", "bridge": "br-int", "label": "tempest-network-smoke--1133784818", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b498512-32", "ovs_interfaceid": "6b498512-32dd-4e59-95bd-71c3a69bb44f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:20:47 np0005538960 nova_compute[187252]: 2025-11-28 16:20:47.119 187256 DEBUG oslo_concurrency.lockutils [req-a843a87d-3a74-457e-88ee-bab6cba8fba7 req-2332a2c3-01b4-40fc-8f1b-eaf276935acf 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-eff28834-4c5b-46d0-90a8-4be63b9fff80" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:20:47 np0005538960 nova_compute[187252]: 2025-11-28 16:20:47.147 187256 DEBUG nova.compute.manager [req-5e53a521-47ba-4821-a3d5-76cc066ac33e req-372483d9-74a8-49d7-9b02-df1189e9d829 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Received event network-vif-deleted-6b498512-32dd-4e59-95bd-71c3a69bb44f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:20:47 np0005538960 nova_compute[187252]: 2025-11-28 16:20:47.438 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:49 np0005538960 podman[215638]: 2025-11-28 16:20:49.175952635 +0000 UTC m=+0.084611992 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:20:51 np0005538960 nova_compute[187252]: 2025-11-28 16:20:51.183 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:51 np0005538960 nova_compute[187252]: 2025-11-28 16:20:51.400 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:52 np0005538960 nova_compute[187252]: 2025-11-28 16:20:52.441 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:55 np0005538960 podman[215665]: 2025-11-28 16:20:55.190357767 +0000 UTC m=+0.095196850 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:20:56 np0005538960 nova_compute[187252]: 2025-11-28 16:20:56.140 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764346841.1385808, eff28834-4c5b-46d0-90a8-4be63b9fff80 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:20:56 np0005538960 nova_compute[187252]: 2025-11-28 16:20:56.141 187256 INFO nova.compute.manager [-] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:20:56 np0005538960 nova_compute[187252]: 2025-11-28 16:20:56.171 187256 DEBUG nova.compute.manager [None req-31d4f1ea-a8e8-4a4a-be08-fef04cb36519 - - - - - -] [instance: eff28834-4c5b-46d0-90a8-4be63b9fff80] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:20:56 np0005538960 nova_compute[187252]: 2025-11-28 16:20:56.186 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:57 np0005538960 nova_compute[187252]: 2025-11-28 16:20:57.443 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:59 np0005538960 podman[215693]: 2025-11-28 16:20:59.155090467 +0000 UTC m=+0.054594791 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 11:20:59 np0005538960 podman[215692]: 2025-11-28 16:20:59.163138653 +0000 UTC m=+0.065535137 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:20:59 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:59.518 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:20:59 np0005538960 nova_compute[187252]: 2025-11-28 16:20:59.519 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:20:59 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:20:59.521 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:21:01 np0005538960 nova_compute[187252]: 2025-11-28 16:21:01.242 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:02 np0005538960 nova_compute[187252]: 2025-11-28 16:21:02.227 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:02 np0005538960 nova_compute[187252]: 2025-11-28 16:21:02.545 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:02 np0005538960 nova_compute[187252]: 2025-11-28 16:21:02.569 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:05 np0005538960 podman[215731]: 2025-11-28 16:21:05.19618942 +0000 UTC m=+0.090546677 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:21:06 np0005538960 nova_compute[187252]: 2025-11-28 16:21:06.250 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:06.340 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:06.341 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:06.341 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:06.524 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:21:07 np0005538960 nova_compute[187252]: 2025-11-28 16:21:07.547 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:07 np0005538960 nova_compute[187252]: 2025-11-28 16:21:07.625 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquiring lock "0c041a2f-082d-4b83-a004-33444bbe346a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:07 np0005538960 nova_compute[187252]: 2025-11-28 16:21:07.625 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:07 np0005538960 nova_compute[187252]: 2025-11-28 16:21:07.682 187256 DEBUG nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:21:08 np0005538960 podman[215755]: 2025-11-28 16:21:08.1508486 +0000 UTC m=+0.055817190 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., architecture=x86_64, release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, maintainer=Red Hat, Inc.)
Nov 28 11:21:08 np0005538960 nova_compute[187252]: 2025-11-28 16:21:08.362 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:08 np0005538960 nova_compute[187252]: 2025-11-28 16:21:08.363 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:08 np0005538960 nova_compute[187252]: 2025-11-28 16:21:08.372 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:21:08 np0005538960 nova_compute[187252]: 2025-11-28 16:21:08.372 187256 INFO nova.compute.claims [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:21:08 np0005538960 nova_compute[187252]: 2025-11-28 16:21:08.687 187256 DEBUG nova.compute.provider_tree [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:21:08 np0005538960 nova_compute[187252]: 2025-11-28 16:21:08.709 187256 DEBUG nova.scheduler.client.report [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:21:08 np0005538960 nova_compute[187252]: 2025-11-28 16:21:08.744 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.381s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:08 np0005538960 nova_compute[187252]: 2025-11-28 16:21:08.746 187256 DEBUG nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.132 187256 DEBUG nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.133 187256 DEBUG nova.network.neutron [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.297 187256 INFO nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.404 187256 DEBUG nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.514 187256 DEBUG nova.policy [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ae30b34fc1a34e0f9ac06baa690e2e19', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '20010402228a4b3d84e084ca5186ab15', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.641 187256 DEBUG nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.642 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.643 187256 INFO nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Creating image(s)#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.643 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquiring lock "/var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.644 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "/var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.644 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "/var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.656 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.719 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.720 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.721 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.732 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.798 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.800 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.850 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.852 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.852 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.908 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.909 187256 DEBUG nova.virt.disk.api [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Checking if we can resize image /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.910 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.967 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.968 187256 DEBUG nova.virt.disk.api [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Cannot resize image /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.969 187256 DEBUG nova.objects.instance [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lazy-loading 'migration_context' on Instance uuid 0c041a2f-082d-4b83-a004-33444bbe346a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.990 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.991 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Ensure instance console log exists: /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.991 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.991 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:09 np0005538960 nova_compute[187252]: 2025-11-28 16:21:09.992 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:11 np0005538960 nova_compute[187252]: 2025-11-28 16:21:11.255 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:12 np0005538960 nova_compute[187252]: 2025-11-28 16:21:12.549 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:13 np0005538960 nova_compute[187252]: 2025-11-28 16:21:13.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:13 np0005538960 nova_compute[187252]: 2025-11-28 16:21:13.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.338 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.360 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.360 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.360 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.361 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.379 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.380 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.380 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.380 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.564 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.565 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5783MB free_disk=73.34226608276367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.565 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.566 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.690 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance 0c041a2f-082d-4b83-a004-33444bbe346a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.690 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.691 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.850 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:21:15 np0005538960 nova_compute[187252]: 2025-11-28 16:21:15.862 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:21:16 np0005538960 podman[215792]: 2025-11-28 16:21:16.154207404 +0000 UTC m=+0.056862245 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 11:21:16 np0005538960 nova_compute[187252]: 2025-11-28 16:21:16.308 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:16 np0005538960 nova_compute[187252]: 2025-11-28 16:21:16.657 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:21:16 np0005538960 nova_compute[187252]: 2025-11-28 16:21:16.657 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:16 np0005538960 nova_compute[187252]: 2025-11-28 16:21:16.658 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:17 np0005538960 nova_compute[187252]: 2025-11-28 16:21:17.327 187256 DEBUG nova.network.neutron [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Successfully created port: db77c8b0-6e5e-4337-9535-4844aeb1612e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:21:17 np0005538960 nova_compute[187252]: 2025-11-28 16:21:17.551 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:17 np0005538960 nova_compute[187252]: 2025-11-28 16:21:17.626 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:17 np0005538960 nova_compute[187252]: 2025-11-28 16:21:17.626 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:18 np0005538960 nova_compute[187252]: 2025-11-28 16:21:18.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:18 np0005538960 nova_compute[187252]: 2025-11-28 16:21:18.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:21:18 np0005538960 nova_compute[187252]: 2025-11-28 16:21:18.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:21:18 np0005538960 nova_compute[187252]: 2025-11-28 16:21:18.330 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 28 11:21:18 np0005538960 nova_compute[187252]: 2025-11-28 16:21:18.330 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:21:18 np0005538960 nova_compute[187252]: 2025-11-28 16:21:18.331 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:18 np0005538960 nova_compute[187252]: 2025-11-28 16:21:18.331 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:19 np0005538960 nova_compute[187252]: 2025-11-28 16:21:19.326 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:20 np0005538960 podman[215814]: 2025-11-28 16:21:20.171033635 +0000 UTC m=+0.063420976 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:21:21 np0005538960 nova_compute[187252]: 2025-11-28 16:21:21.311 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:21 np0005538960 nova_compute[187252]: 2025-11-28 16:21:21.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:21:21 np0005538960 nova_compute[187252]: 2025-11-28 16:21:21.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 11:21:21 np0005538960 nova_compute[187252]: 2025-11-28 16:21:21.349 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 11:21:22 np0005538960 nova_compute[187252]: 2025-11-28 16:21:22.553 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:24 np0005538960 nova_compute[187252]: 2025-11-28 16:21:24.494 187256 DEBUG nova.network.neutron [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Successfully updated port: db77c8b0-6e5e-4337-9535-4844aeb1612e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:21:26 np0005538960 podman[215839]: 2025-11-28 16:21:26.212063346 +0000 UTC m=+0.111888146 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:21:26 np0005538960 nova_compute[187252]: 2025-11-28 16:21:26.313 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:26 np0005538960 nova_compute[187252]: 2025-11-28 16:21:26.947 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquiring lock "refresh_cache-0c041a2f-082d-4b83-a004-33444bbe346a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:21:26 np0005538960 nova_compute[187252]: 2025-11-28 16:21:26.948 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquired lock "refresh_cache-0c041a2f-082d-4b83-a004-33444bbe346a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:21:26 np0005538960 nova_compute[187252]: 2025-11-28 16:21:26.948 187256 DEBUG nova.network.neutron [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:21:27 np0005538960 nova_compute[187252]: 2025-11-28 16:21:27.555 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:28 np0005538960 nova_compute[187252]: 2025-11-28 16:21:28.568 187256 DEBUG nova.compute.manager [req-063639ba-db96-4f2e-ae08-6419b2fcf04d req-557dee82-3ec4-45b2-8c67-16275ecb7520 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Received event network-changed-db77c8b0-6e5e-4337-9535-4844aeb1612e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:21:28 np0005538960 nova_compute[187252]: 2025-11-28 16:21:28.569 187256 DEBUG nova.compute.manager [req-063639ba-db96-4f2e-ae08-6419b2fcf04d req-557dee82-3ec4-45b2-8c67-16275ecb7520 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Refreshing instance network info cache due to event network-changed-db77c8b0-6e5e-4337-9535-4844aeb1612e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:21:28 np0005538960 nova_compute[187252]: 2025-11-28 16:21:28.569 187256 DEBUG oslo_concurrency.lockutils [req-063639ba-db96-4f2e-ae08-6419b2fcf04d req-557dee82-3ec4-45b2-8c67-16275ecb7520 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-0c041a2f-082d-4b83-a004-33444bbe346a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:21:28 np0005538960 nova_compute[187252]: 2025-11-28 16:21:28.805 187256 DEBUG nova.network.neutron [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:21:30 np0005538960 podman[215867]: 2025-11-28 16:21:30.160370087 +0000 UTC m=+0.065587159 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 11:21:30 np0005538960 podman[215866]: 2025-11-28 16:21:30.168510435 +0000 UTC m=+0.078536403 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 11:21:31 np0005538960 nova_compute[187252]: 2025-11-28 16:21:31.317 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:32 np0005538960 nova_compute[187252]: 2025-11-28 16:21:32.558 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.272 187256 DEBUG nova.network.neutron [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Updating instance_info_cache with network_info: [{"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.307 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Releasing lock "refresh_cache-0c041a2f-082d-4b83-a004-33444bbe346a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.308 187256 DEBUG nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Instance network_info: |[{"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.308 187256 DEBUG oslo_concurrency.lockutils [req-063639ba-db96-4f2e-ae08-6419b2fcf04d req-557dee82-3ec4-45b2-8c67-16275ecb7520 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-0c041a2f-082d-4b83-a004-33444bbe346a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.308 187256 DEBUG nova.network.neutron [req-063639ba-db96-4f2e-ae08-6419b2fcf04d req-557dee82-3ec4-45b2-8c67-16275ecb7520 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Refreshing network info cache for port db77c8b0-6e5e-4337-9535-4844aeb1612e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.313 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Start _get_guest_xml network_info=[{"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.318 187256 WARNING nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.332 187256 DEBUG nova.virt.libvirt.host [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.333 187256 DEBUG nova.virt.libvirt.host [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.341 187256 DEBUG nova.virt.libvirt.host [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.342 187256 DEBUG nova.virt.libvirt.host [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.343 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.343 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.344 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.344 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.345 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.345 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.345 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.346 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.346 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.346 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.347 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.347 187256 DEBUG nova.virt.hardware [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.352 187256 DEBUG nova.virt.libvirt.vif [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:21:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-436562832',display_name='tempest-TestServerBasicOps-server-436562832',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-436562832',id=12,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE4SUgJ/3SuIoocsAypVszwRFdJaTlBuNvnp5NUiZnz7NVCvXMPP3FOAZ5KHyay41JxUgLQw9/27RSv4yqOe1N4u4u/BjrDfnxYWONPYeUleQ/pUShAAGr9rYjCK4tcgSg==',key_name='tempest-TestServerBasicOps-1196511644',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20010402228a4b3d84e084ca5186ab15',ramdisk_id='',reservation_id='r-si9fpwjg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1659108837',owner_user_name='tempest-TestServerBasicOps-1659108837-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:21:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ae30b34fc1a34e0f9ac06baa690e2e19',uuid=0c041a2f-082d-4b83-a004-33444bbe346a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.352 187256 DEBUG nova.network.os_vif_util [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Converting VIF {"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.353 187256 DEBUG nova.network.os_vif_util [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:1c:c9,bridge_name='br-int',has_traffic_filtering=True,id=db77c8b0-6e5e-4337-9535-4844aeb1612e,network=Network(a4639934-cd6c-4019-971f-cbfc09a0bb49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb77c8b0-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.354 187256 DEBUG nova.objects.instance [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0c041a2f-082d-4b83-a004-33444bbe346a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.368 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <uuid>0c041a2f-082d-4b83-a004-33444bbe346a</uuid>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <name>instance-0000000c</name>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestServerBasicOps-server-436562832</nova:name>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:21:33</nova:creationTime>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:        <nova:user uuid="ae30b34fc1a34e0f9ac06baa690e2e19">tempest-TestServerBasicOps-1659108837-project-member</nova:user>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:        <nova:project uuid="20010402228a4b3d84e084ca5186ab15">tempest-TestServerBasicOps-1659108837</nova:project>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:        <nova:port uuid="db77c8b0-6e5e-4337-9535-4844aeb1612e">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <entry name="serial">0c041a2f-082d-4b83-a004-33444bbe346a</entry>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <entry name="uuid">0c041a2f-082d-4b83-a004-33444bbe346a</entry>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk.config"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:7a:1c:c9"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <target dev="tapdb77c8b0-6e"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/console.log" append="off"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:21:33 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:21:33 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:21:33 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:21:33 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.370 187256 DEBUG nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Preparing to wait for external event network-vif-plugged-db77c8b0-6e5e-4337-9535-4844aeb1612e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.370 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquiring lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.370 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.371 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.372 187256 DEBUG nova.virt.libvirt.vif [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:21:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-436562832',display_name='tempest-TestServerBasicOps-server-436562832',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-436562832',id=12,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE4SUgJ/3SuIoocsAypVszwRFdJaTlBuNvnp5NUiZnz7NVCvXMPP3FOAZ5KHyay41JxUgLQw9/27RSv4yqOe1N4u4u/BjrDfnxYWONPYeUleQ/pUShAAGr9rYjCK4tcgSg==',key_name='tempest-TestServerBasicOps-1196511644',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='20010402228a4b3d84e084ca5186ab15',ramdisk_id='',reservation_id='r-si9fpwjg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerBasicOps-1659108837',owner_user_name='tempest-TestServerBasicOps-1659108837-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:21:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ae30b34fc1a34e0f9ac06baa690e2e19',uuid=0c041a2f-082d-4b83-a004-33444bbe346a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.372 187256 DEBUG nova.network.os_vif_util [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Converting VIF {"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.373 187256 DEBUG nova.network.os_vif_util [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7a:1c:c9,bridge_name='br-int',has_traffic_filtering=True,id=db77c8b0-6e5e-4337-9535-4844aeb1612e,network=Network(a4639934-cd6c-4019-971f-cbfc09a0bb49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb77c8b0-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.374 187256 DEBUG os_vif [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:1c:c9,bridge_name='br-int',has_traffic_filtering=True,id=db77c8b0-6e5e-4337-9535-4844aeb1612e,network=Network(a4639934-cd6c-4019-971f-cbfc09a0bb49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb77c8b0-6e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.374 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.375 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.375 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.378 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.378 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdb77c8b0-6e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.379 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdb77c8b0-6e, col_values=(('external_ids', {'iface-id': 'db77c8b0-6e5e-4337-9535-4844aeb1612e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7a:1c:c9', 'vm-uuid': '0c041a2f-082d-4b83-a004-33444bbe346a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.381 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:33 np0005538960 NetworkManager[55548]: <info>  [1764346893.3823] manager: (tapdb77c8b0-6e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.383 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.389 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.390 187256 INFO os_vif [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7a:1c:c9,bridge_name='br-int',has_traffic_filtering=True,id=db77c8b0-6e5e-4337-9535-4844aeb1612e,network=Network(a4639934-cd6c-4019-971f-cbfc09a0bb49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb77c8b0-6e')#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.434 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.435 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.435 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] No VIF found with MAC fa:16:3e:7a:1c:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:21:33 np0005538960 nova_compute[187252]: 2025-11-28 16:21:33.436 187256 INFO nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Using config drive#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.190 187256 INFO nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Creating config drive at /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk.config#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.198 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4corx_qc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.330 187256 DEBUG oslo_concurrency.processutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4corx_qc" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:21:34 np0005538960 kernel: tapdb77c8b0-6e: entered promiscuous mode
Nov 28 11:21:34 np0005538960 ovn_controller[95460]: 2025-11-28T16:21:34Z|00058|binding|INFO|Claiming lport db77c8b0-6e5e-4337-9535-4844aeb1612e for this chassis.
Nov 28 11:21:34 np0005538960 ovn_controller[95460]: 2025-11-28T16:21:34Z|00059|binding|INFO|db77c8b0-6e5e-4337-9535-4844aeb1612e: Claiming fa:16:3e:7a:1c:c9 10.100.0.11
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.410 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.413 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:34 np0005538960 NetworkManager[55548]: <info>  [1764346894.4150] manager: (tapdb77c8b0-6e): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Nov 28 11:21:34 np0005538960 systemd-udevd[215921]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:21:34 np0005538960 NetworkManager[55548]: <info>  [1764346894.4546] device (tapdb77c8b0-6e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:21:34 np0005538960 NetworkManager[55548]: <info>  [1764346894.4554] device (tapdb77c8b0-6e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:21:34 np0005538960 systemd-machined[153518]: New machine qemu-4-instance-0000000c.
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.467 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:34 np0005538960 ovn_controller[95460]: 2025-11-28T16:21:34Z|00060|binding|INFO|Setting lport db77c8b0-6e5e-4337-9535-4844aeb1612e ovn-installed in OVS
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.474 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:34 np0005538960 systemd[1]: Started Virtual Machine qemu-4-instance-0000000c.
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.716 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346894.715751, 0c041a2f-082d-4b83-a004-33444bbe346a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.717 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] VM Started (Lifecycle Event)#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.744 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.748 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346894.7169502, 0c041a2f-082d-4b83-a004-33444bbe346a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.749 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.769 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.774 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:21:34 np0005538960 nova_compute[187252]: 2025-11-28 16:21:34.795 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:21:35 np0005538960 ovn_controller[95460]: 2025-11-28T16:21:35Z|00061|binding|INFO|Setting lport db77c8b0-6e5e-4337-9535-4844aeb1612e up in Southbound
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.523 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:1c:c9 10.100.0.11'], port_security=['fa:16:3e:7a:1c:c9 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '0c041a2f-082d-4b83-a004-33444bbe346a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4639934-cd6c-4019-971f-cbfc09a0bb49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20010402228a4b3d84e084ca5186ab15', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8af4c696-b929-46b1-9e0f-5694add283af c364b436-0288-43e1-bfce-1f6cd77763be', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=949f675f-c948-4974-951c-86031ef50831, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=db77c8b0-6e5e-4337-9535-4844aeb1612e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.524 104369 INFO neutron.agent.ovn.metadata.agent [-] Port db77c8b0-6e5e-4337-9535-4844aeb1612e in datapath a4639934-cd6c-4019-971f-cbfc09a0bb49 bound to our chassis#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.527 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a4639934-cd6c-4019-971f-cbfc09a0bb49#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.541 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[76b60966-48f0-4983-a2c3-941711eb9850]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.542 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa4639934-c1 in ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.544 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa4639934-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.545 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ffc35e74-f0e1-4842-b54d-552c2afaedba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.546 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5088f1e7-7650-4c24-9ec7-160637d02ae2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.559 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[a5aae919-28f5-4fa2-915a-5b6e34963dab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.588 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba8e101-26ce-4a5a-a7ce-891e857bc1d8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.625 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[692919f7-bce8-49e3-8f4a-e72f6ae82ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 systemd-udevd[215926]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.633 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[92d20273-afaf-4fc4-b9cb-60177fd360b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 NetworkManager[55548]: <info>  [1764346895.6367] manager: (tapa4639934-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Nov 28 11:21:35 np0005538960 podman[215942]: 2025-11-28 16:21:35.662930091 +0000 UTC m=+0.071220556 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.666 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[35c7b271-2860-40f3-8496-9db183e39491]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.669 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[ce78aace-0d37-485f-80d6-a7aee7bc3849]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 NetworkManager[55548]: <info>  [1764346895.6919] device (tapa4639934-c0): carrier: link connected
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.697 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[98bc0365-bf39-4945-9390-04ef583ad74d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.715 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5e2b67f1-6796-44e0-903c-502d9721e0a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa4639934-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:df:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399527, 'reachable_time': 44564, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215986, 'error': None, 'target': 'ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.731 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b5f04309-bc21-4d79-ba35-5c6844848327]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed3:df1a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399527, 'tstamp': 399527}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215987, 'error': None, 'target': 'ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.749 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[84a0653b-ef77-425d-a0a8-0366f23dff52]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa4639934-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d3:df:1a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399527, 'reachable_time': 44564, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215988, 'error': None, 'target': 'ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.780 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[694df20b-6c65-43cb-b6ba-73c33b766697]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.850 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4afbb7-85eb-4e8a-ba00-64c67769011a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.851 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4639934-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.852 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.852 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa4639934-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:21:35 np0005538960 nova_compute[187252]: 2025-11-28 16:21:35.854 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:35 np0005538960 NetworkManager[55548]: <info>  [1764346895.8551] manager: (tapa4639934-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Nov 28 11:21:35 np0005538960 kernel: tapa4639934-c0: entered promiscuous mode
Nov 28 11:21:35 np0005538960 nova_compute[187252]: 2025-11-28 16:21:35.856 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.858 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa4639934-c0, col_values=(('external_ids', {'iface-id': '88fb46f0-2118-4faf-9b0a-683e4080150e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:21:35 np0005538960 nova_compute[187252]: 2025-11-28 16:21:35.859 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:35 np0005538960 ovn_controller[95460]: 2025-11-28T16:21:35Z|00062|binding|INFO|Releasing lport 88fb46f0-2118-4faf-9b0a-683e4080150e from this chassis (sb_readonly=0)
Nov 28 11:21:35 np0005538960 nova_compute[187252]: 2025-11-28 16:21:35.871 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.872 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a4639934-cd6c-4019-971f-cbfc09a0bb49.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a4639934-cd6c-4019-971f-cbfc09a0bb49.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.873 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ab310c3f-c9d7-474d-845d-5195f41964fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.874 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-a4639934-cd6c-4019-971f-cbfc09a0bb49
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/a4639934-cd6c-4019-971f-cbfc09a0bb49.pid.haproxy
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID a4639934-cd6c-4019-971f-cbfc09a0bb49
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:21:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:21:35.875 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49', 'env', 'PROCESS_TAG=haproxy-a4639934-cd6c-4019-971f-cbfc09a0bb49', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a4639934-cd6c-4019-971f-cbfc09a0bb49.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:21:36 np0005538960 podman[216021]: 2025-11-28 16:21:36.292081824 +0000 UTC m=+0.071240716 container create 15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 11:21:36 np0005538960 systemd[1]: Started libpod-conmon-15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f.scope.
Nov 28 11:21:36 np0005538960 podman[216021]: 2025-11-28 16:21:36.253629408 +0000 UTC m=+0.032788330 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:21:36 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:21:36 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b96b7b28316a76a0ef526b65aa81e8f333bd94e80e19e4c9ca5f15676f166613/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:21:36 np0005538960 podman[216021]: 2025-11-28 16:21:36.384535216 +0000 UTC m=+0.163694168 container init 15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:21:36 np0005538960 podman[216021]: 2025-11-28 16:21:36.391039394 +0000 UTC m=+0.170198296 container start 15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:21:36 np0005538960 neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49[216036]: [NOTICE]   (216040) : New worker (216042) forked
Nov 28 11:21:36 np0005538960 neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49[216036]: [NOTICE]   (216040) : Loading success.
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.415 187256 DEBUG nova.compute.manager [req-47819121-e7cc-4c6e-ba59-736f7a25e3d6 req-bc6c754c-a7f8-42b9-a9fb-196c37eee40c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Received event network-vif-plugged-db77c8b0-6e5e-4337-9535-4844aeb1612e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.416 187256 DEBUG oslo_concurrency.lockutils [req-47819121-e7cc-4c6e-ba59-736f7a25e3d6 req-bc6c754c-a7f8-42b9-a9fb-196c37eee40c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.416 187256 DEBUG oslo_concurrency.lockutils [req-47819121-e7cc-4c6e-ba59-736f7a25e3d6 req-bc6c754c-a7f8-42b9-a9fb-196c37eee40c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.417 187256 DEBUG oslo_concurrency.lockutils [req-47819121-e7cc-4c6e-ba59-736f7a25e3d6 req-bc6c754c-a7f8-42b9-a9fb-196c37eee40c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.417 187256 DEBUG nova.compute.manager [req-47819121-e7cc-4c6e-ba59-736f7a25e3d6 req-bc6c754c-a7f8-42b9-a9fb-196c37eee40c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Processing event network-vif-plugged-db77c8b0-6e5e-4337-9535-4844aeb1612e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.418 187256 DEBUG nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.422 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346897.4221113, 0c041a2f-082d-4b83-a004-33444bbe346a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.422 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.425 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.429 187256 INFO nova.virt.libvirt.driver [-] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Instance spawned successfully.#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.430 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.471 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.472 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.473 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.473 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.474 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.474 187256 DEBUG nova.virt.libvirt.driver [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.480 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.486 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.514 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.606 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.675 187256 INFO nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Took 28.03 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.676 187256 DEBUG nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.797 187256 INFO nova.compute.manager [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Took 29.48 seconds to build instance.#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.892 187256 DEBUG oslo_concurrency.lockutils [None req-f5544e15-4e7c-4202-9902-d4d10e82cbad ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 30.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.932 187256 DEBUG nova.network.neutron [req-063639ba-db96-4f2e-ae08-6419b2fcf04d req-557dee82-3ec4-45b2-8c67-16275ecb7520 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Updated VIF entry in instance network info cache for port db77c8b0-6e5e-4337-9535-4844aeb1612e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:21:37 np0005538960 nova_compute[187252]: 2025-11-28 16:21:37.933 187256 DEBUG nova.network.neutron [req-063639ba-db96-4f2e-ae08-6419b2fcf04d req-557dee82-3ec4-45b2-8c67-16275ecb7520 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Updating instance_info_cache with network_info: [{"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:21:38 np0005538960 nova_compute[187252]: 2025-11-28 16:21:38.351 187256 DEBUG oslo_concurrency.lockutils [req-063639ba-db96-4f2e-ae08-6419b2fcf04d req-557dee82-3ec4-45b2-8c67-16275ecb7520 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-0c041a2f-082d-4b83-a004-33444bbe346a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:21:38 np0005538960 nova_compute[187252]: 2025-11-28 16:21:38.382 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:39 np0005538960 podman[216051]: 2025-11-28 16:21:39.202673842 +0000 UTC m=+0.097311662 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 11:21:39 np0005538960 nova_compute[187252]: 2025-11-28 16:21:39.549 187256 DEBUG nova.compute.manager [req-f5e410c4-b9c3-4e26-8caf-85292bf2671a req-f4b0c20c-1d08-4a7e-8d40-4610b0ca59ea 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Received event network-vif-plugged-db77c8b0-6e5e-4337-9535-4844aeb1612e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:21:39 np0005538960 nova_compute[187252]: 2025-11-28 16:21:39.550 187256 DEBUG oslo_concurrency.lockutils [req-f5e410c4-b9c3-4e26-8caf-85292bf2671a req-f4b0c20c-1d08-4a7e-8d40-4610b0ca59ea 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:21:39 np0005538960 nova_compute[187252]: 2025-11-28 16:21:39.550 187256 DEBUG oslo_concurrency.lockutils [req-f5e410c4-b9c3-4e26-8caf-85292bf2671a req-f4b0c20c-1d08-4a7e-8d40-4610b0ca59ea 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:21:39 np0005538960 nova_compute[187252]: 2025-11-28 16:21:39.550 187256 DEBUG oslo_concurrency.lockutils [req-f5e410c4-b9c3-4e26-8caf-85292bf2671a req-f4b0c20c-1d08-4a7e-8d40-4610b0ca59ea 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:21:39 np0005538960 nova_compute[187252]: 2025-11-28 16:21:39.550 187256 DEBUG nova.compute.manager [req-f5e410c4-b9c3-4e26-8caf-85292bf2671a req-f4b0c20c-1d08-4a7e-8d40-4610b0ca59ea 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] No waiting events found dispatching network-vif-plugged-db77c8b0-6e5e-4337-9535-4844aeb1612e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:21:39 np0005538960 nova_compute[187252]: 2025-11-28 16:21:39.551 187256 WARNING nova.compute.manager [req-f5e410c4-b9c3-4e26-8caf-85292bf2671a req-f4b0c20c-1d08-4a7e-8d40-4610b0ca59ea 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Received unexpected event network-vif-plugged-db77c8b0-6e5e-4337-9535-4844aeb1612e for instance with vm_state active and task_state None.#033[00m
Nov 28 11:21:42 np0005538960 nova_compute[187252]: 2025-11-28 16:21:42.608 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:43 np0005538960 nova_compute[187252]: 2025-11-28 16:21:43.431 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:46 np0005538960 NetworkManager[55548]: <info>  [1764346906.5991] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Nov 28 11:21:46 np0005538960 NetworkManager[55548]: <info>  [1764346906.6002] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Nov 28 11:21:46 np0005538960 nova_compute[187252]: 2025-11-28 16:21:46.603 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:46 np0005538960 nova_compute[187252]: 2025-11-28 16:21:46.847 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:21:46Z|00063|binding|INFO|Releasing lport 88fb46f0-2118-4faf-9b0a-683e4080150e from this chassis (sb_readonly=0)
Nov 28 11:21:46 np0005538960 nova_compute[187252]: 2025-11-28 16:21:46.945 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:47 np0005538960 podman[216072]: 2025-11-28 16:21:47.188784616 +0000 UTC m=+0.090070956 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:21:47 np0005538960 nova_compute[187252]: 2025-11-28 16:21:47.609 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:47 np0005538960 nova_compute[187252]: 2025-11-28 16:21:47.984 187256 DEBUG nova.compute.manager [req-7b4dfd0e-5e01-4fcb-b06d-df7cf410195c req-02fdc1e6-d88a-44bb-88fa-16d00019dbe0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Received event network-changed-db77c8b0-6e5e-4337-9535-4844aeb1612e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:21:47 np0005538960 nova_compute[187252]: 2025-11-28 16:21:47.986 187256 DEBUG nova.compute.manager [req-7b4dfd0e-5e01-4fcb-b06d-df7cf410195c req-02fdc1e6-d88a-44bb-88fa-16d00019dbe0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Refreshing instance network info cache due to event network-changed-db77c8b0-6e5e-4337-9535-4844aeb1612e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:21:47 np0005538960 nova_compute[187252]: 2025-11-28 16:21:47.986 187256 DEBUG oslo_concurrency.lockutils [req-7b4dfd0e-5e01-4fcb-b06d-df7cf410195c req-02fdc1e6-d88a-44bb-88fa-16d00019dbe0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-0c041a2f-082d-4b83-a004-33444bbe346a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:21:47 np0005538960 nova_compute[187252]: 2025-11-28 16:21:47.987 187256 DEBUG oslo_concurrency.lockutils [req-7b4dfd0e-5e01-4fcb-b06d-df7cf410195c req-02fdc1e6-d88a-44bb-88fa-16d00019dbe0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-0c041a2f-082d-4b83-a004-33444bbe346a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:21:47 np0005538960 nova_compute[187252]: 2025-11-28 16:21:47.987 187256 DEBUG nova.network.neutron [req-7b4dfd0e-5e01-4fcb-b06d-df7cf410195c req-02fdc1e6-d88a-44bb-88fa-16d00019dbe0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Refreshing network info cache for port db77c8b0-6e5e-4337-9535-4844aeb1612e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:21:48 np0005538960 nova_compute[187252]: 2025-11-28 16:21:48.435 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:51 np0005538960 podman[216110]: 2025-11-28 16:21:51.183278502 +0000 UTC m=+0.084295124 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:21:51 np0005538960 nova_compute[187252]: 2025-11-28 16:21:51.360 187256 DEBUG nova.network.neutron [req-7b4dfd0e-5e01-4fcb-b06d-df7cf410195c req-02fdc1e6-d88a-44bb-88fa-16d00019dbe0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Updated VIF entry in instance network info cache for port db77c8b0-6e5e-4337-9535-4844aeb1612e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:21:51 np0005538960 nova_compute[187252]: 2025-11-28 16:21:51.361 187256 DEBUG nova.network.neutron [req-7b4dfd0e-5e01-4fcb-b06d-df7cf410195c req-02fdc1e6-d88a-44bb-88fa-16d00019dbe0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Updating instance_info_cache with network_info: [{"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:21:51 np0005538960 nova_compute[187252]: 2025-11-28 16:21:51.393 187256 DEBUG oslo_concurrency.lockutils [req-7b4dfd0e-5e01-4fcb-b06d-df7cf410195c req-02fdc1e6-d88a-44bb-88fa-16d00019dbe0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-0c041a2f-082d-4b83-a004-33444bbe346a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:21:52 np0005538960 ovn_controller[95460]: 2025-11-28T16:21:52Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7a:1c:c9 10.100.0.11
Nov 28 11:21:52 np0005538960 ovn_controller[95460]: 2025-11-28T16:21:52Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7a:1c:c9 10.100.0.11
Nov 28 11:21:52 np0005538960 nova_compute[187252]: 2025-11-28 16:21:52.611 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:53 np0005538960 nova_compute[187252]: 2025-11-28 16:21:53.438 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:57 np0005538960 podman[216135]: 2025-11-28 16:21:57.2250289 +0000 UTC m=+0.127950768 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 11:21:57 np0005538960 nova_compute[187252]: 2025-11-28 16:21:57.614 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:21:58 np0005538960 nova_compute[187252]: 2025-11-28 16:21:58.440 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:00 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:00Z|00064|binding|INFO|Releasing lport 88fb46f0-2118-4faf-9b0a-683e4080150e from this chassis (sb_readonly=0)
Nov 28 11:22:00 np0005538960 nova_compute[187252]: 2025-11-28 16:22:00.440 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:22:00 np0005538960 nova_compute[187252]: 2025-11-28 16:22:00.472 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Triggering sync for uuid 0c041a2f-082d-4b83-a004-33444bbe346a _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Nov 28 11:22:00 np0005538960 nova_compute[187252]: 2025-11-28 16:22:00.472 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "0c041a2f-082d-4b83-a004-33444bbe346a" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:00 np0005538960 nova_compute[187252]: 2025-11-28 16:22:00.472 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "0c041a2f-082d-4b83-a004-33444bbe346a" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:00 np0005538960 nova_compute[187252]: 2025-11-28 16:22:00.482 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:00 np0005538960 nova_compute[187252]: 2025-11-28 16:22:00.503 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "0c041a2f-082d-4b83-a004-33444bbe346a" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.030s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:01 np0005538960 podman[216166]: 2025-11-28 16:22:01.172235875 +0000 UTC m=+0.068539971 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 11:22:01 np0005538960 podman[216165]: 2025-11-28 16:22:01.172121602 +0000 UTC m=+0.068782306 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, org.label-schema.license=GPLv2)
Nov 28 11:22:01 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:01.235 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:22:01 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:01.236 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:22:01 np0005538960 nova_compute[187252]: 2025-11-28 16:22:01.236 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:02 np0005538960 nova_compute[187252]: 2025-11-28 16:22:02.617 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:03 np0005538960 nova_compute[187252]: 2025-11-28 16:22:03.443 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.097 187256 DEBUG nova.compute.manager [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.198 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.199 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.240 187256 DEBUG nova.objects.instance [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'pci_requests' on Instance uuid dab22d6d-ad90-4a53-b395-4d8aa1875048 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.256 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.256 187256 INFO nova.compute.claims [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.257 187256 DEBUG nova.objects.instance [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'resources' on Instance uuid dab22d6d-ad90-4a53-b395-4d8aa1875048 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.271 187256 DEBUG nova.objects.instance [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'numa_topology' on Instance uuid dab22d6d-ad90-4a53-b395-4d8aa1875048 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.279 187256 DEBUG nova.objects.instance [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'pci_devices' on Instance uuid dab22d6d-ad90-4a53-b395-4d8aa1875048 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.322 187256 INFO nova.compute.resource_tracker [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updating resource usage from migration 1a20dd0d-100a-4d04-8eae-d58fe52c58ab#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.323 187256 DEBUG nova.compute.resource_tracker [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Starting to track incoming migration 1a20dd0d-100a-4d04-8eae-d58fe52c58ab with flavor c90217bd-1e89-4c68-8e01-33bf1cee456c _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.408 187256 DEBUG nova.compute.provider_tree [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.419 187256 DEBUG nova.scheduler.client.report [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.439 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:04 np0005538960 nova_compute[187252]: 2025-11-28 16:22:04.440 187256 INFO nova.compute.manager [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Migrating#033[00m
Nov 28 11:22:04 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:04.466 104477 DEBUG eventlet.wsgi.server [-] (104477) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Nov 28 11:22:04 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:04.468 104477 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /latest/meta-data/public-ipv4 HTTP/1.0#015
Nov 28 11:22:04 np0005538960 ovn_metadata_agent[104364]: Accept: */*#015
Nov 28 11:22:04 np0005538960 ovn_metadata_agent[104364]: Connection: close#015
Nov 28 11:22:04 np0005538960 ovn_metadata_agent[104364]: Content-Type: text/plain#015
Nov 28 11:22:04 np0005538960 ovn_metadata_agent[104364]: Host: 169.254.169.254#015
Nov 28 11:22:04 np0005538960 ovn_metadata_agent[104364]: User-Agent: curl/7.84.0#015
Nov 28 11:22:04 np0005538960 ovn_metadata_agent[104364]: X-Forwarded-For: 10.100.0.11#015
Nov 28 11:22:04 np0005538960 ovn_metadata_agent[104364]: X-Ovn-Network-Id: a4639934-cd6c-4019-971f-cbfc09a0bb49 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:05.834 104477 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:05.835 104477 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "GET /latest/meta-data/public-ipv4 HTTP/1.1" status: 200  len: 151 time: 1.3673954#033[00m
Nov 28 11:22:05 np0005538960 haproxy-metadata-proxy-a4639934-cd6c-4019-971f-cbfc09a0bb49[216042]: 10.100.0.11:48470 [28/Nov/2025:16:22:04.465] listener listener/metadata 0/0/0/1369/1369 200 135 - - ---- 1/1/0/0/0 0/0 "GET /latest/meta-data/public-ipv4 HTTP/1.1"
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:05.960 104477 DEBUG eventlet.wsgi.server [-] (104477) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:05.961 104477 DEBUG neutron.agent.ovn.metadata.server [-] Request: POST /openstack/2013-10-17/password HTTP/1.0#015
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: Accept: */*#015
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: Connection: close#015
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: Content-Length: 100#015
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: Content-Type: application/x-www-form-urlencoded#015
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: Host: 169.254.169.254#015
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: User-Agent: curl/7.84.0#015
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: X-Forwarded-For: 10.100.0.11#015
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: X-Ovn-Network-Id: a4639934-cd6c-4019-971f-cbfc09a0bb49#015
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: #015
Nov 28 11:22:05 np0005538960 ovn_metadata_agent[104364]: testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Nov 28 11:22:06 np0005538960 podman[216204]: 2025-11-28 16:22:06.150246034 +0000 UTC m=+0.058368092 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:22:06 np0005538960 systemd[1]: Created slice User Slice of UID 42436.
Nov 28 11:22:06 np0005538960 systemd[1]: Starting User Runtime Directory /run/user/42436...
Nov 28 11:22:06 np0005538960 systemd-logind[788]: New session 34 of user nova.
Nov 28 11:22:06 np0005538960 haproxy-metadata-proxy-a4639934-cd6c-4019-971f-cbfc09a0bb49[216042]: 10.100.0.11:48482 [28/Nov/2025:16:22:05.959] listener listener/metadata 0/0/0/325/325 200 118 - - ---- 1/1/0/0/0 0/0 "POST /openstack/2013-10-17/password HTTP/1.1"
Nov 28 11:22:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:06.282 104477 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Nov 28 11:22:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:06.285 104477 INFO eventlet.wsgi.server [-] 10.100.0.11,<local> "POST /openstack/2013-10-17/password HTTP/1.1" status: 200  len: 134 time: 0.3233600#033[00m
Nov 28 11:22:06 np0005538960 systemd[1]: Finished User Runtime Directory /run/user/42436.
Nov 28 11:22:06 np0005538960 systemd[1]: Starting User Manager for UID 42436...
Nov 28 11:22:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:06.341 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:06.342 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:06.343 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:06 np0005538960 systemd[216234]: Queued start job for default target Main User Target.
Nov 28 11:22:06 np0005538960 systemd[216234]: Created slice User Application Slice.
Nov 28 11:22:06 np0005538960 systemd[216234]: Started Mark boot as successful after the user session has run 2 minutes.
Nov 28 11:22:06 np0005538960 systemd[216234]: Started Daily Cleanup of User's Temporary Directories.
Nov 28 11:22:06 np0005538960 systemd[216234]: Reached target Paths.
Nov 28 11:22:06 np0005538960 systemd[216234]: Reached target Timers.
Nov 28 11:22:06 np0005538960 systemd[216234]: Starting D-Bus User Message Bus Socket...
Nov 28 11:22:06 np0005538960 systemd[216234]: Starting Create User's Volatile Files and Directories...
Nov 28 11:22:06 np0005538960 systemd[216234]: Listening on D-Bus User Message Bus Socket.
Nov 28 11:22:06 np0005538960 systemd[216234]: Reached target Sockets.
Nov 28 11:22:06 np0005538960 systemd[216234]: Finished Create User's Volatile Files and Directories.
Nov 28 11:22:06 np0005538960 systemd[216234]: Reached target Basic System.
Nov 28 11:22:06 np0005538960 systemd[216234]: Reached target Main User Target.
Nov 28 11:22:06 np0005538960 systemd[216234]: Startup finished in 149ms.
Nov 28 11:22:06 np0005538960 systemd[1]: Started User Manager for UID 42436.
Nov 28 11:22:06 np0005538960 systemd[1]: Started Session 34 of User nova.
Nov 28 11:22:06 np0005538960 systemd[1]: session-34.scope: Deactivated successfully.
Nov 28 11:22:06 np0005538960 systemd-logind[788]: Session 34 logged out. Waiting for processes to exit.
Nov 28 11:22:06 np0005538960 systemd-logind[788]: Removed session 34.
Nov 28 11:22:06 np0005538960 systemd-logind[788]: New session 36 of user nova.
Nov 28 11:22:06 np0005538960 systemd[1]: Started Session 36 of User nova.
Nov 28 11:22:06 np0005538960 systemd[1]: session-36.scope: Deactivated successfully.
Nov 28 11:22:06 np0005538960 systemd-logind[788]: Session 36 logged out. Waiting for processes to exit.
Nov 28 11:22:06 np0005538960 systemd-logind[788]: Removed session 36.
Nov 28 11:22:07 np0005538960 nova_compute[187252]: 2025-11-28 16:22:07.058 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:07 np0005538960 nova_compute[187252]: 2025-11-28 16:22:07.687 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:08.239 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.450 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.621 187256 DEBUG oslo_concurrency.lockutils [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquiring lock "0c041a2f-082d-4b83-a004-33444bbe346a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.622 187256 DEBUG oslo_concurrency.lockutils [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.622 187256 DEBUG oslo_concurrency.lockutils [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquiring lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.622 187256 DEBUG oslo_concurrency.lockutils [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.623 187256 DEBUG oslo_concurrency.lockutils [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.624 187256 INFO nova.compute.manager [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Terminating instance#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.625 187256 DEBUG nova.compute.manager [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:22:08 np0005538960 kernel: tapdb77c8b0-6e (unregistering): left promiscuous mode
Nov 28 11:22:08 np0005538960 NetworkManager[55548]: <info>  [1764346928.6556] device (tapdb77c8b0-6e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:22:08 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:08Z|00065|binding|INFO|Releasing lport db77c8b0-6e5e-4337-9535-4844aeb1612e from this chassis (sb_readonly=0)
Nov 28 11:22:08 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:08Z|00066|binding|INFO|Setting lport db77c8b0-6e5e-4337-9535-4844aeb1612e down in Southbound
Nov 28 11:22:08 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:08Z|00067|binding|INFO|Removing iface tapdb77c8b0-6e ovn-installed in OVS
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.666 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.669 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:08.676 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7a:1c:c9 10.100.0.11'], port_security=['fa:16:3e:7a:1c:c9 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '0c041a2f-082d-4b83-a004-33444bbe346a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4639934-cd6c-4019-971f-cbfc09a0bb49', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '20010402228a4b3d84e084ca5186ab15', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8af4c696-b929-46b1-9e0f-5694add283af c364b436-0288-43e1-bfce-1f6cd77763be', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.183'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=949f675f-c948-4974-951c-86031ef50831, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=db77c8b0-6e5e-4337-9535-4844aeb1612e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:22:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:08.678 104369 INFO neutron.agent.ovn.metadata.agent [-] Port db77c8b0-6e5e-4337-9535-4844aeb1612e in datapath a4639934-cd6c-4019-971f-cbfc09a0bb49 unbound from our chassis#033[00m
Nov 28 11:22:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:08.680 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a4639934-cd6c-4019-971f-cbfc09a0bb49, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.683 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:08.684 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e758ac99-f775-41e7-a3ba-0bd61fbb5f10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:08 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:08.685 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49 namespace which is not needed anymore#033[00m
Nov 28 11:22:08 np0005538960 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Nov 28 11:22:08 np0005538960 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Consumed 14.847s CPU time.
Nov 28 11:22:08 np0005538960 systemd-machined[153518]: Machine qemu-4-instance-0000000c terminated.
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.902 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:08 np0005538960 neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49[216036]: [NOTICE]   (216040) : haproxy version is 2.8.14-c23fe91
Nov 28 11:22:08 np0005538960 neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49[216036]: [NOTICE]   (216040) : path to executable is /usr/sbin/haproxy
Nov 28 11:22:08 np0005538960 neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49[216036]: [WARNING]  (216040) : Exiting Master process...
Nov 28 11:22:08 np0005538960 neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49[216036]: [WARNING]  (216040) : Exiting Master process...
Nov 28 11:22:08 np0005538960 neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49[216036]: [ALERT]    (216040) : Current worker (216042) exited with code 143 (Terminated)
Nov 28 11:22:08 np0005538960 neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49[216036]: [WARNING]  (216040) : All workers exited. Exiting... (0)
Nov 28 11:22:08 np0005538960 systemd[1]: libpod-15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f.scope: Deactivated successfully.
Nov 28 11:22:08 np0005538960 podman[216279]: 2025-11-28 16:22:08.932690882 +0000 UTC m=+0.138398242 container died 15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.940 187256 INFO nova.virt.libvirt.driver [-] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Instance destroyed successfully.#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.941 187256 DEBUG nova.objects.instance [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lazy-loading 'resources' on Instance uuid 0c041a2f-082d-4b83-a004-33444bbe346a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.964 187256 DEBUG nova.virt.libvirt.vif [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:21:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerBasicOps-server-436562832',display_name='tempest-TestServerBasicOps-server-436562832',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testserverbasicops-server-436562832',id=12,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE4SUgJ/3SuIoocsAypVszwRFdJaTlBuNvnp5NUiZnz7NVCvXMPP3FOAZ5KHyay41JxUgLQw9/27RSv4yqOe1N4u4u/BjrDfnxYWONPYeUleQ/pUShAAGr9rYjCK4tcgSg==',key_name='tempest-TestServerBasicOps-1196511644',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:21:37Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={meta1='data1',meta2='data2',metaN='dataN'},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='20010402228a4b3d84e084ca5186ab15',ramdisk_id='',reservation_id='r-si9fpwjg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerBasicOps-1659108837',owner_user_name='tempest-TestServerBasicOps-1659108837-project-member',password_0='testtesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttesttest',password_1='',password_2='',password_3=''},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:22:06Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='ae30b34fc1a34e0f9ac06baa690e2e19',uuid=0c041a2f-082d-4b83-a004-33444bbe346a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.965 187256 DEBUG nova.network.os_vif_util [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Converting VIF {"id": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "address": "fa:16:3e:7a:1c:c9", "network": {"id": "a4639934-cd6c-4019-971f-cbfc09a0bb49", "bridge": "br-int", "label": "tempest-TestServerBasicOps-1805413059-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.183", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "20010402228a4b3d84e084ca5186ab15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdb77c8b0-6e", "ovs_interfaceid": "db77c8b0-6e5e-4337-9535-4844aeb1612e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.966 187256 DEBUG nova.network.os_vif_util [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7a:1c:c9,bridge_name='br-int',has_traffic_filtering=True,id=db77c8b0-6e5e-4337-9535-4844aeb1612e,network=Network(a4639934-cd6c-4019-971f-cbfc09a0bb49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb77c8b0-6e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.966 187256 DEBUG os_vif [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:1c:c9,bridge_name='br-int',has_traffic_filtering=True,id=db77c8b0-6e5e-4337-9535-4844aeb1612e,network=Network(a4639934-cd6c-4019-971f-cbfc09a0bb49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb77c8b0-6e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.968 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.969 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdb77c8b0-6e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.970 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.972 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.976 187256 INFO os_vif [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7a:1c:c9,bridge_name='br-int',has_traffic_filtering=True,id=db77c8b0-6e5e-4337-9535-4844aeb1612e,network=Network(a4639934-cd6c-4019-971f-cbfc09a0bb49),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdb77c8b0-6e')#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.977 187256 INFO nova.virt.libvirt.driver [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Deleting instance files /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a_del#033[00m
Nov 28 11:22:08 np0005538960 nova_compute[187252]: 2025-11-28 16:22:08.978 187256 INFO nova.virt.libvirt.driver [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Deletion of /var/lib/nova/instances/0c041a2f-082d-4b83-a004-33444bbe346a_del complete#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.044 187256 INFO nova.compute.manager [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.045 187256 DEBUG oslo.service.loopingcall [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.045 187256 DEBUG nova.compute.manager [-] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.046 187256 DEBUG nova.network.neutron [-] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:22:09 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f-userdata-shm.mount: Deactivated successfully.
Nov 28 11:22:09 np0005538960 podman[216279]: 2025-11-28 16:22:09.412931538 +0000 UTC m=+0.618638878 container cleanup 15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 11:22:09 np0005538960 systemd[1]: var-lib-containers-storage-overlay-b96b7b28316a76a0ef526b65aa81e8f333bd94e80e19e4c9ca5f15676f166613-merged.mount: Deactivated successfully.
Nov 28 11:22:09 np0005538960 systemd[1]: libpod-conmon-15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f.scope: Deactivated successfully.
Nov 28 11:22:09 np0005538960 podman[216325]: 2025-11-28 16:22:09.527490567 +0000 UTC m=+0.091693134 container remove 15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 11:22:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:09.534 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1e55165f-da42-4890-b573-c954416dd899]: (4, ('Fri Nov 28 04:22:08 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49 (15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f)\n15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f\nFri Nov 28 04:22:09 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49 (15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f)\n15d54767edadf6fe5e98d2ce3d93a376ce7ca4f2f95567799a16a2e44be9371f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:09.537 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[553656c8-dfb1-4028-8978-b6384de16500]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:09.538 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4639934-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.541 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:09 np0005538960 kernel: tapa4639934-c0: left promiscuous mode
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.556 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:09.559 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[92b91c87-343f-4ed9-96b5-8cb68fe2ab87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:09.581 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[02f5214e-fee2-481a-a4ce-88d43b40c5f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:09.583 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[005c271d-4841-4133-9233-f1d530f5eb8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:09 np0005538960 podman[216323]: 2025-11-28 16:22:09.586019983 +0000 UTC m=+0.153184112 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:22:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:09.600 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[900cc865-0846-460a-85a5-b8a21eeb2a3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399519, 'reachable_time': 24926, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216360, 'error': None, 'target': 'ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:09 np0005538960 systemd[1]: run-netns-ovnmeta\x2da4639934\x2dcd6c\x2d4019\x2d971f\x2dcbfc09a0bb49.mount: Deactivated successfully.
Nov 28 11:22:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:09.605 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a4639934-cd6c-4019-971f-cbfc09a0bb49 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:22:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:09.605 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[959aa034-50da-4501-8517-75cfb8f92b6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.641 187256 DEBUG nova.compute.manager [req-8a580103-4856-40ed-a83d-aa5fad31e05b req-3d5ddf60-9496-4900-80ff-bd1d671b0668 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received event network-vif-unplugged-83c201fe-1c64-4b7f-a908-de59340b5670 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.642 187256 DEBUG oslo_concurrency.lockutils [req-8a580103-4856-40ed-a83d-aa5fad31e05b req-3d5ddf60-9496-4900-80ff-bd1d671b0668 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.642 187256 DEBUG oslo_concurrency.lockutils [req-8a580103-4856-40ed-a83d-aa5fad31e05b req-3d5ddf60-9496-4900-80ff-bd1d671b0668 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.642 187256 DEBUG oslo_concurrency.lockutils [req-8a580103-4856-40ed-a83d-aa5fad31e05b req-3d5ddf60-9496-4900-80ff-bd1d671b0668 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.642 187256 DEBUG nova.compute.manager [req-8a580103-4856-40ed-a83d-aa5fad31e05b req-3d5ddf60-9496-4900-80ff-bd1d671b0668 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] No waiting events found dispatching network-vif-unplugged-83c201fe-1c64-4b7f-a908-de59340b5670 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:22:09 np0005538960 nova_compute[187252]: 2025-11-28 16:22:09.643 187256 WARNING nova.compute.manager [req-8a580103-4856-40ed-a83d-aa5fad31e05b req-3d5ddf60-9496-4900-80ff-bd1d671b0668 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received unexpected event network-vif-unplugged-83c201fe-1c64-4b7f-a908-de59340b5670 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 28 11:22:10 np0005538960 systemd-logind[788]: New session 37 of user nova.
Nov 28 11:22:10 np0005538960 systemd[1]: Started Session 37 of User nova.
Nov 28 11:22:10 np0005538960 nova_compute[187252]: 2025-11-28 16:22:10.592 187256 DEBUG nova.network.neutron [-] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:22:10 np0005538960 systemd[1]: session-37.scope: Deactivated successfully.
Nov 28 11:22:10 np0005538960 systemd-logind[788]: Session 37 logged out. Waiting for processes to exit.
Nov 28 11:22:10 np0005538960 systemd-logind[788]: Removed session 37.
Nov 28 11:22:10 np0005538960 nova_compute[187252]: 2025-11-28 16:22:10.627 187256 INFO nova.compute.manager [-] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Took 1.58 seconds to deallocate network for instance.#033[00m
Nov 28 11:22:10 np0005538960 nova_compute[187252]: 2025-11-28 16:22:10.677 187256 DEBUG oslo_concurrency.lockutils [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:10 np0005538960 nova_compute[187252]: 2025-11-28 16:22:10.678 187256 DEBUG oslo_concurrency.lockutils [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:10 np0005538960 systemd-logind[788]: New session 38 of user nova.
Nov 28 11:22:10 np0005538960 systemd[1]: Started Session 38 of User nova.
Nov 28 11:22:10 np0005538960 nova_compute[187252]: 2025-11-28 16:22:10.772 187256 DEBUG nova.compute.provider_tree [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:22:10 np0005538960 nova_compute[187252]: 2025-11-28 16:22:10.784 187256 DEBUG nova.scheduler.client.report [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:22:10 np0005538960 nova_compute[187252]: 2025-11-28 16:22:10.801 187256 DEBUG oslo_concurrency.lockutils [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:10 np0005538960 nova_compute[187252]: 2025-11-28 16:22:10.831 187256 INFO nova.scheduler.client.report [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Deleted allocations for instance 0c041a2f-082d-4b83-a004-33444bbe346a#033[00m
Nov 28 11:22:10 np0005538960 systemd[1]: session-38.scope: Deactivated successfully.
Nov 28 11:22:10 np0005538960 systemd-logind[788]: Session 38 logged out. Waiting for processes to exit.
Nov 28 11:22:10 np0005538960 systemd-logind[788]: Removed session 38.
Nov 28 11:22:10 np0005538960 nova_compute[187252]: 2025-11-28 16:22:10.899 187256 DEBUG oslo_concurrency.lockutils [None req-6f03ae00-b308-4f1a-a7d4-381b07bf4cec ae30b34fc1a34e0f9ac06baa690e2e19 20010402228a4b3d84e084ca5186ab15 - - default default] Lock "0c041a2f-082d-4b83-a004-33444bbe346a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.278s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:10 np0005538960 systemd-logind[788]: New session 39 of user nova.
Nov 28 11:22:10 np0005538960 systemd[1]: Started Session 39 of User nova.
Nov 28 11:22:11 np0005538960 systemd[1]: session-39.scope: Deactivated successfully.
Nov 28 11:22:11 np0005538960 systemd-logind[788]: Session 39 logged out. Waiting for processes to exit.
Nov 28 11:22:11 np0005538960 systemd-logind[788]: Removed session 39.
Nov 28 11:22:11 np0005538960 nova_compute[187252]: 2025-11-28 16:22:11.744 187256 DEBUG nova.compute.manager [req-6df207d6-a156-4976-9bde-3453885fa269 req-795a5a5b-9e7a-4222-b619-dad57c8933fb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received event network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:11 np0005538960 nova_compute[187252]: 2025-11-28 16:22:11.745 187256 DEBUG oslo_concurrency.lockutils [req-6df207d6-a156-4976-9bde-3453885fa269 req-795a5a5b-9e7a-4222-b619-dad57c8933fb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:11 np0005538960 nova_compute[187252]: 2025-11-28 16:22:11.745 187256 DEBUG oslo_concurrency.lockutils [req-6df207d6-a156-4976-9bde-3453885fa269 req-795a5a5b-9e7a-4222-b619-dad57c8933fb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:11 np0005538960 nova_compute[187252]: 2025-11-28 16:22:11.745 187256 DEBUG oslo_concurrency.lockutils [req-6df207d6-a156-4976-9bde-3453885fa269 req-795a5a5b-9e7a-4222-b619-dad57c8933fb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:11 np0005538960 nova_compute[187252]: 2025-11-28 16:22:11.746 187256 DEBUG nova.compute.manager [req-6df207d6-a156-4976-9bde-3453885fa269 req-795a5a5b-9e7a-4222-b619-dad57c8933fb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] No waiting events found dispatching network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:22:11 np0005538960 nova_compute[187252]: 2025-11-28 16:22:11.746 187256 WARNING nova.compute.manager [req-6df207d6-a156-4976-9bde-3453885fa269 req-795a5a5b-9e7a-4222-b619-dad57c8933fb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received unexpected event network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 28 11:22:11 np0005538960 nova_compute[187252]: 2025-11-28 16:22:11.746 187256 DEBUG nova.compute.manager [req-6df207d6-a156-4976-9bde-3453885fa269 req-795a5a5b-9e7a-4222-b619-dad57c8933fb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Received event network-vif-deleted-db77c8b0-6e5e-4337-9535-4844aeb1612e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:12 np0005538960 nova_compute[187252]: 2025-11-28 16:22:12.279 187256 INFO nova.network.neutron [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updating port 83c201fe-1c64-4b7f-a908-de59340b5670 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 28 11:22:12 np0005538960 nova_compute[187252]: 2025-11-28 16:22:12.690 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:13 np0005538960 nova_compute[187252]: 2025-11-28 16:22:13.972 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:14 np0005538960 nova_compute[187252]: 2025-11-28 16:22:14.901 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquiring lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:22:14 np0005538960 nova_compute[187252]: 2025-11-28 16:22:14.901 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquired lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:22:14 np0005538960 nova_compute[187252]: 2025-11-28 16:22:14.901 187256 DEBUG nova.network.neutron [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.345 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.345 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.346 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.346 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.554 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.555 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5736MB free_disk=73.31403732299805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.555 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.556 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.605 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Applying migration context for instance dab22d6d-ad90-4a53-b395-4d8aa1875048 as it has an incoming, in-progress migration 1a20dd0d-100a-4d04-8eae-d58fe52c58ab. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.606 187256 INFO nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updating resource usage from migration 1a20dd0d-100a-4d04-8eae-d58fe52c58ab#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.626 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance dab22d6d-ad90-4a53-b395-4d8aa1875048 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.627 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.627 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.695 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.713 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.743 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:22:15 np0005538960 nova_compute[187252]: 2025-11-28 16:22:15.744 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:16 np0005538960 nova_compute[187252]: 2025-11-28 16:22:16.129 187256 DEBUG nova.compute.manager [req-b420028e-8528-4249-9eec-8fdc4ae1ecd3 req-3ce58197-437d-4105-aba7-cf2dab1de7f6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received event network-changed-83c201fe-1c64-4b7f-a908-de59340b5670 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:16 np0005538960 nova_compute[187252]: 2025-11-28 16:22:16.130 187256 DEBUG nova.compute.manager [req-b420028e-8528-4249-9eec-8fdc4ae1ecd3 req-3ce58197-437d-4105-aba7-cf2dab1de7f6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Refreshing instance network info cache due to event network-changed-83c201fe-1c64-4b7f-a908-de59340b5670. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:22:16 np0005538960 nova_compute[187252]: 2025-11-28 16:22:16.130 187256 DEBUG oslo_concurrency.lockutils [req-b420028e-8528-4249-9eec-8fdc4ae1ecd3 req-3ce58197-437d-4105-aba7-cf2dab1de7f6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:22:16 np0005538960 nova_compute[187252]: 2025-11-28 16:22:16.744 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:22:17 np0005538960 nova_compute[187252]: 2025-11-28 16:22:17.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:22:17 np0005538960 nova_compute[187252]: 2025-11-28 16:22:17.691 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:18 np0005538960 podman[216378]: 2025-11-28 16:22:18.17299885 +0000 UTC m=+0.072669031 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute)
Nov 28 11:22:18 np0005538960 nova_compute[187252]: 2025-11-28 16:22:18.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:22:18 np0005538960 nova_compute[187252]: 2025-11-28 16:22:18.903 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:18 np0005538960 nova_compute[187252]: 2025-11-28 16:22:18.974 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.317 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.317 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.337 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.741 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.876 187256 DEBUG nova.network.neutron [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updating instance_info_cache with network_info: [{"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.893 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Releasing lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.898 187256 DEBUG oslo_concurrency.lockutils [req-b420028e-8528-4249-9eec-8fdc4ae1ecd3 req-3ce58197-437d-4105-aba7-cf2dab1de7f6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.898 187256 DEBUG nova.network.neutron [req-b420028e-8528-4249-9eec-8fdc4ae1ecd3 req-3ce58197-437d-4105-aba7-cf2dab1de7f6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Refreshing network info cache for port 83c201fe-1c64-4b7f-a908-de59340b5670 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.979 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.982 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.982 187256 INFO nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Creating image(s)#033[00m
Nov 28 11:22:19 np0005538960 nova_compute[187252]: 2025-11-28 16:22:19.984 187256 DEBUG nova.objects.instance [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'trusted_certs' on Instance uuid dab22d6d-ad90-4a53-b395-4d8aa1875048 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.010 187256 DEBUG oslo_concurrency.processutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.046 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.080 187256 DEBUG oslo_concurrency.processutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.081 187256 DEBUG nova.virt.disk.api [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Checking if we can resize image /var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.082 187256 DEBUG oslo_concurrency.processutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.145 187256 DEBUG oslo_concurrency.processutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.146 187256 DEBUG nova.virt.disk.api [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Cannot resize image /var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.162 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.163 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Ensure instance console log exists: /var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.165 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.165 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.166 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.173 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Start _get_guest_xml network_info=[{"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--703408347", "vif_mac": "fa:16:3e:d1:55:0f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.183 187256 WARNING nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.187 187256 DEBUG nova.virt.libvirt.host [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.188 187256 DEBUG nova.virt.libvirt.host [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.191 187256 DEBUG nova.virt.libvirt.host [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.192 187256 DEBUG nova.virt.libvirt.host [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.193 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.193 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.194 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.194 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.194 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.195 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.195 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.195 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.196 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.196 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.196 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.196 187256 DEBUG nova.virt.hardware [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.197 187256 DEBUG nova.objects.instance [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lazy-loading 'vcpu_model' on Instance uuid dab22d6d-ad90-4a53-b395-4d8aa1875048 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.213 187256 DEBUG oslo_concurrency.processutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.279 187256 DEBUG oslo_concurrency.processutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk.config --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.281 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Acquiring lock "/var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.281 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "/var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.282 187256 DEBUG oslo_concurrency.lockutils [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Lock "/var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.283 187256 DEBUG nova.virt.libvirt.vif [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:21:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1213806647',display_name='tempest-TestNetworkAdvancedServerOps-server-1213806647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1213806647',id=13,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH58GHoYAWcLcaiGAlD/LZfshXsPCF6Sn6OBas+pyQHOdCLCNFrTg+8Mm+RJG2mUOWsRqQEkkKfgShtb4l9p6G4CKRyYbKoZHPcpZa1CIwxB3EKlka8FFnYuHyFBQ07OUQ==',key_name='tempest-TestNetworkAdvancedServerOps-365393498',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:21:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-jm0yqf14',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:22:11Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=dab22d6d-ad90-4a53-b395-4d8aa1875048,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--703408347", "vif_mac": "fa:16:3e:d1:55:0f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.283 187256 DEBUG nova.network.os_vif_util [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Converting VIF {"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--703408347", "vif_mac": "fa:16:3e:d1:55:0f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.284 187256 DEBUG nova.network.os_vif_util [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:55:0f,bridge_name='br-int',has_traffic_filtering=True,id=83c201fe-1c64-4b7f-a908-de59340b5670,network=Network(e507f2fa-d0dd-4152-9c54-7c943a1fdc5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83c201fe-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.286 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <uuid>dab22d6d-ad90-4a53-b395-4d8aa1875048</uuid>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <name>instance-0000000d</name>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1213806647</nova:name>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:22:20</nova:creationTime>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:        <nova:user uuid="5d381eba17324dd5ad798648b82d0115">tempest-TestNetworkAdvancedServerOps-762685809-project-member</nova:user>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:        <nova:project uuid="7e408bace48b41a1ac0677d300b6d288">tempest-TestNetworkAdvancedServerOps-762685809</nova:project>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:        <nova:port uuid="83c201fe-1c64-4b7f-a908-de59340b5670">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <entry name="serial">dab22d6d-ad90-4a53-b395-4d8aa1875048</entry>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <entry name="uuid">dab22d6d-ad90-4a53-b395-4d8aa1875048</entry>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/disk.config"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:d1:55:0f"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <target dev="tap83c201fe-1c"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/console.log" append="off"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:22:20 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:22:20 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:22:20 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:22:20 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.288 187256 DEBUG nova.virt.libvirt.vif [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:21:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1213806647',display_name='tempest-TestNetworkAdvancedServerOps-server-1213806647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1213806647',id=13,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH58GHoYAWcLcaiGAlD/LZfshXsPCF6Sn6OBas+pyQHOdCLCNFrTg+8Mm+RJG2mUOWsRqQEkkKfgShtb4l9p6G4CKRyYbKoZHPcpZa1CIwxB3EKlka8FFnYuHyFBQ07OUQ==',key_name='tempest-TestNetworkAdvancedServerOps-365393498',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:21:39Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-jm0yqf14',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:22:11Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=dab22d6d-ad90-4a53-b395-4d8aa1875048,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--703408347", "vif_mac": "fa:16:3e:d1:55:0f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.288 187256 DEBUG nova.network.os_vif_util [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Converting VIF {"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--703408347", "vif_mac": "fa:16:3e:d1:55:0f"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.288 187256 DEBUG nova.network.os_vif_util [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d1:55:0f,bridge_name='br-int',has_traffic_filtering=True,id=83c201fe-1c64-4b7f-a908-de59340b5670,network=Network(e507f2fa-d0dd-4152-9c54-7c943a1fdc5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83c201fe-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.289 187256 DEBUG os_vif [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:55:0f,bridge_name='br-int',has_traffic_filtering=True,id=83c201fe-1c64-4b7f-a908-de59340b5670,network=Network(e507f2fa-d0dd-4152-9c54-7c943a1fdc5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83c201fe-1c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.289 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.290 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.290 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.293 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.293 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap83c201fe-1c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.294 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap83c201fe-1c, col_values=(('external_ids', {'iface-id': '83c201fe-1c64-4b7f-a908-de59340b5670', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d1:55:0f', 'vm-uuid': 'dab22d6d-ad90-4a53-b395-4d8aa1875048'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.295 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 NetworkManager[55548]: <info>  [1764346940.2970] manager: (tap83c201fe-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.297 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.305 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.307 187256 INFO os_vif [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d1:55:0f,bridge_name='br-int',has_traffic_filtering=True,id=83c201fe-1c64-4b7f-a908-de59340b5670,network=Network(e507f2fa-d0dd-4152-9c54-7c943a1fdc5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83c201fe-1c')#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.363 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.364 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.364 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] No VIF found with MAC fa:16:3e:d1:55:0f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.364 187256 INFO nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Using config drive#033[00m
Nov 28 11:22:20 np0005538960 kernel: tap83c201fe-1c: entered promiscuous mode
Nov 28 11:22:20 np0005538960 NetworkManager[55548]: <info>  [1764346940.4370] manager: (tap83c201fe-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Nov 28 11:22:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:20Z|00068|binding|INFO|Claiming lport 83c201fe-1c64-4b7f-a908-de59340b5670 for this chassis.
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.437 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:20Z|00069|binding|INFO|83c201fe-1c64-4b7f-a908-de59340b5670: Claiming fa:16:3e:d1:55:0f 10.100.0.3
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.441 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.446 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.449 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.462 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 NetworkManager[55548]: <info>  [1764346940.4635] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Nov 28 11:22:20 np0005538960 NetworkManager[55548]: <info>  [1764346940.4645] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Nov 28 11:22:20 np0005538960 systemd-udevd[216422]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.467 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:55:0f 10.100.0.3'], port_security=['fa:16:3e:d1:55:0f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dab22d6d-ad90-4a53-b395-4d8aa1875048', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '6', 'neutron:security_group_ids': '73670fc6-df48-4dc6-93f4-06266035780d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.238'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a285ce7-2378-4e7f-88d4-82e315e4ae1e, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=83c201fe-1c64-4b7f-a908-de59340b5670) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.469 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 83c201fe-1c64-4b7f-a908-de59340b5670 in datapath e507f2fa-d0dd-4152-9c54-7c943a1fdc5b bound to our chassis#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.472 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e507f2fa-d0dd-4152-9c54-7c943a1fdc5b#033[00m
Nov 28 11:22:20 np0005538960 systemd-machined[153518]: New machine qemu-5-instance-0000000d.
Nov 28 11:22:20 np0005538960 NetworkManager[55548]: <info>  [1764346940.4820] device (tap83c201fe-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:22:20 np0005538960 NetworkManager[55548]: <info>  [1764346940.4832] device (tap83c201fe-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.490 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[226939fc-1d20-4ccb-8bb3-7a257b37d724]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.491 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape507f2fa-d1 in ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.493 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape507f2fa-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.494 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b63f1ca3-6b20-4073-8b91-14fb04ba8e8d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.495 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8fe305-ab27-4d37-acae-65d916ff6b3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.506 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[a35fe26e-f3b0-4844-a3d6-693d12bb5c0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 systemd[1]: Started Virtual Machine qemu-5-instance-0000000d.
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.540 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[076ec092-16c2-4acd-ba1c-c50a444233a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.585 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[de3d4a0a-a1da-4147-b8bf-54420d0f1360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.605 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[495b2bef-1ee1-494e-ab71-0d90a5385fa2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 NetworkManager[55548]: <info>  [1764346940.6072] manager: (tape507f2fa-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Nov 28 11:22:20 np0005538960 systemd-udevd[216425]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.649 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[e87f645c-c0b0-47f0-aea4-ae14b18486b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.652 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[9e49b8ce-a36e-4d5e-93a5-6ce48ca7d9bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 NetworkManager[55548]: <info>  [1764346940.6812] device (tape507f2fa-d0): carrier: link connected
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.686 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[757075e7-d367-4577-927a-f3f43f7c1d27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.708 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7712ce-c1ae-4dd0-bbc9-76a1876af039]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape507f2fa-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:70:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404026, 'reachable_time': 38479, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216456, 'error': None, 'target': 'ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.725 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.727 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ce4e0753-6620-48b2-b9f2-0f7190bc41a6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:7060'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404026, 'tstamp': 404026}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216457, 'error': None, 'target': 'ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.746 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f5cf6acd-d9c2-47e5-8542-38a4062f25ae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape507f2fa-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:70:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404026, 'reachable_time': 38479, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216458, 'error': None, 'target': 'ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.759 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:20Z|00070|binding|INFO|Setting lport 83c201fe-1c64-4b7f-a908-de59340b5670 ovn-installed in OVS
Nov 28 11:22:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:20Z|00071|binding|INFO|Setting lport 83c201fe-1c64-4b7f-a908-de59340b5670 up in Southbound
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.769 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.781 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[435c86d6-0d28-437c-b4e9-7d58cf3c5598]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.853 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e72a1d23-5688-4596-bc7c-753d258cbde7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.855 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape507f2fa-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.855 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.856 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape507f2fa-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.858 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 NetworkManager[55548]: <info>  [1764346940.8587] manager: (tape507f2fa-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Nov 28 11:22:20 np0005538960 kernel: tape507f2fa-d0: entered promiscuous mode
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.862 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape507f2fa-d0, col_values=(('external_ids', {'iface-id': '07e83ec8-b82c-434a-a6ae-23785661b2f5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.863 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:20Z|00072|binding|INFO|Releasing lport 07e83ec8-b82c-434a-a6ae-23785661b2f5 from this chassis (sb_readonly=0)
Nov 28 11:22:20 np0005538960 nova_compute[187252]: 2025-11-28 16:22:20.894 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.896 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e507f2fa-d0dd-4152-9c54-7c943a1fdc5b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e507f2fa-d0dd-4152-9c54-7c943a1fdc5b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.898 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ae2dd2e9-2ee5-43cb-9b7f-1bf9fe3eddb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.898 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/e507f2fa-d0dd-4152-9c54-7c943a1fdc5b.pid.haproxy
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID e507f2fa-d0dd-4152-9c54-7c943a1fdc5b
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:22:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:20.899 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b', 'env', 'PROCESS_TAG=haproxy-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e507f2fa-d0dd-4152-9c54-7c943a1fdc5b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.031 187256 DEBUG nova.compute.manager [req-e0607512-72ce-4bd0-9141-70f850d9bc02 req-e8fee451-d89c-41a1-8128-fd04bd1f0a3b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received event network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.031 187256 DEBUG oslo_concurrency.lockutils [req-e0607512-72ce-4bd0-9141-70f850d9bc02 req-e8fee451-d89c-41a1-8128-fd04bd1f0a3b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.032 187256 DEBUG oslo_concurrency.lockutils [req-e0607512-72ce-4bd0-9141-70f850d9bc02 req-e8fee451-d89c-41a1-8128-fd04bd1f0a3b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.032 187256 DEBUG oslo_concurrency.lockutils [req-e0607512-72ce-4bd0-9141-70f850d9bc02 req-e8fee451-d89c-41a1-8128-fd04bd1f0a3b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.032 187256 DEBUG nova.compute.manager [req-e0607512-72ce-4bd0-9141-70f850d9bc02 req-e8fee451-d89c-41a1-8128-fd04bd1f0a3b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] No waiting events found dispatching network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.033 187256 WARNING nova.compute.manager [req-e0607512-72ce-4bd0-9141-70f850d9bc02 req-e8fee451-d89c-41a1-8128-fd04bd1f0a3b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received unexpected event network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 for instance with vm_state active and task_state resize_finish.#033[00m
Nov 28 11:22:21 np0005538960 podman[216488]: 2025-11-28 16:22:21.301023053 +0000 UTC m=+0.056559348 container create 924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:22:21 np0005538960 systemd[1]: Started libpod-conmon-924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6.scope.
Nov 28 11:22:21 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:22:21 np0005538960 podman[216488]: 2025-11-28 16:22:21.272044627 +0000 UTC m=+0.027580952 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:22:21 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e51fb827aa5efac33200f62c419234325b6970feb70727c3cd7ad6476bb62e17/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:22:21 np0005538960 podman[216488]: 2025-11-28 16:22:21.383087382 +0000 UTC m=+0.138623707 container init 924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 11:22:21 np0005538960 podman[216488]: 2025-11-28 16:22:21.390746708 +0000 UTC m=+0.146283023 container start 924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 11:22:21 np0005538960 neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b[216504]: [NOTICE]   (216525) : New worker (216534) forked
Nov 28 11:22:21 np0005538960 neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b[216504]: [NOTICE]   (216525) : Loading success.
Nov 28 11:22:21 np0005538960 podman[216501]: 2025-11-28 16:22:21.437454286 +0000 UTC m=+0.094779600 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:22:21 np0005538960 systemd[1]: Stopping User Manager for UID 42436...
Nov 28 11:22:21 np0005538960 systemd[216234]: Activating special unit Exit the Session...
Nov 28 11:22:21 np0005538960 systemd[216234]: Stopped target Main User Target.
Nov 28 11:22:21 np0005538960 systemd[216234]: Stopped target Basic System.
Nov 28 11:22:21 np0005538960 systemd[216234]: Stopped target Paths.
Nov 28 11:22:21 np0005538960 systemd[216234]: Stopped target Sockets.
Nov 28 11:22:21 np0005538960 systemd[216234]: Stopped target Timers.
Nov 28 11:22:21 np0005538960 systemd[216234]: Stopped Mark boot as successful after the user session has run 2 minutes.
Nov 28 11:22:21 np0005538960 systemd[216234]: Stopped Daily Cleanup of User's Temporary Directories.
Nov 28 11:22:21 np0005538960 systemd[216234]: Closed D-Bus User Message Bus Socket.
Nov 28 11:22:21 np0005538960 systemd[216234]: Stopped Create User's Volatile Files and Directories.
Nov 28 11:22:21 np0005538960 systemd[216234]: Removed slice User Application Slice.
Nov 28 11:22:21 np0005538960 systemd[216234]: Reached target Shutdown.
Nov 28 11:22:21 np0005538960 systemd[216234]: Finished Exit the Session.
Nov 28 11:22:21 np0005538960 systemd[216234]: Reached target Exit the Session.
Nov 28 11:22:21 np0005538960 systemd[1]: user@42436.service: Deactivated successfully.
Nov 28 11:22:21 np0005538960 systemd[1]: Stopped User Manager for UID 42436.
Nov 28 11:22:21 np0005538960 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.624 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346941.6236823, dab22d6d-ad90-4a53-b395-4d8aa1875048 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.624 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.627 187256 DEBUG nova.compute.manager [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:22:21 np0005538960 systemd[1]: run-user-42436.mount: Deactivated successfully.
Nov 28 11:22:21 np0005538960 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Nov 28 11:22:21 np0005538960 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Nov 28 11:22:21 np0005538960 systemd[1]: Removed slice User Slice of UID 42436.
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.635 187256 INFO nova.virt.libvirt.driver [-] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Instance running successfully.#033[00m
Nov 28 11:22:21 np0005538960 virtqemud[186797]: argument unsupported: QEMU guest agent is not configured
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.638 187256 DEBUG nova.virt.libvirt.guest [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.638 187256 DEBUG nova.virt.libvirt.driver [None req-dc1e57ed-6b8b-4996-9ded-de60699a412e 2562cddf6ac84719b6b13614caba6d0f 09a0eb4614f44feca895f3fe31084d3a - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.651 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.659 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.695 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.696 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346941.6276486, dab22d6d-ad90-4a53-b395-4d8aa1875048 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.696 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] VM Started (Lifecycle Event)#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.724 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:22:21 np0005538960 nova_compute[187252]: 2025-11-28 16:22:21.732 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:22:22 np0005538960 nova_compute[187252]: 2025-11-28 16:22:22.694 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:23 np0005538960 nova_compute[187252]: 2025-11-28 16:22:23.558 187256 DEBUG nova.compute.manager [req-477f563a-70e2-4390-bc55-9503d57095e8 req-23f92089-8513-4256-9faf-4eeb3eda631b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received event network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:23 np0005538960 nova_compute[187252]: 2025-11-28 16:22:23.558 187256 DEBUG oslo_concurrency.lockutils [req-477f563a-70e2-4390-bc55-9503d57095e8 req-23f92089-8513-4256-9faf-4eeb3eda631b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:23 np0005538960 nova_compute[187252]: 2025-11-28 16:22:23.558 187256 DEBUG oslo_concurrency.lockutils [req-477f563a-70e2-4390-bc55-9503d57095e8 req-23f92089-8513-4256-9faf-4eeb3eda631b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:23 np0005538960 nova_compute[187252]: 2025-11-28 16:22:23.558 187256 DEBUG oslo_concurrency.lockutils [req-477f563a-70e2-4390-bc55-9503d57095e8 req-23f92089-8513-4256-9faf-4eeb3eda631b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:23 np0005538960 nova_compute[187252]: 2025-11-28 16:22:23.558 187256 DEBUG nova.compute.manager [req-477f563a-70e2-4390-bc55-9503d57095e8 req-23f92089-8513-4256-9faf-4eeb3eda631b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] No waiting events found dispatching network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:22:23 np0005538960 nova_compute[187252]: 2025-11-28 16:22:23.559 187256 WARNING nova.compute.manager [req-477f563a-70e2-4390-bc55-9503d57095e8 req-23f92089-8513-4256-9faf-4eeb3eda631b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received unexpected event network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 for instance with vm_state resized and task_state None.#033[00m
Nov 28 11:22:23 np0005538960 nova_compute[187252]: 2025-11-28 16:22:23.938 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764346928.9370298, 0c041a2f-082d-4b83-a004-33444bbe346a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:22:23 np0005538960 nova_compute[187252]: 2025-11-28 16:22:23.939 187256 INFO nova.compute.manager [-] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:22:23 np0005538960 nova_compute[187252]: 2025-11-28 16:22:23.963 187256 DEBUG nova.compute.manager [None req-d5641db3-0e50-4794-8776-df0768429a69 - - - - - -] [instance: 0c041a2f-082d-4b83-a004-33444bbe346a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:22:25 np0005538960 nova_compute[187252]: 2025-11-28 16:22:25.057 187256 DEBUG nova.network.neutron [req-b420028e-8528-4249-9eec-8fdc4ae1ecd3 req-3ce58197-437d-4105-aba7-cf2dab1de7f6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updated VIF entry in instance network info cache for port 83c201fe-1c64-4b7f-a908-de59340b5670. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:22:25 np0005538960 nova_compute[187252]: 2025-11-28 16:22:25.057 187256 DEBUG nova.network.neutron [req-b420028e-8528-4249-9eec-8fdc4ae1ecd3 req-3ce58197-437d-4105-aba7-cf2dab1de7f6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updating instance_info_cache with network_info: [{"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:22:25 np0005538960 nova_compute[187252]: 2025-11-28 16:22:25.079 187256 DEBUG oslo_concurrency.lockutils [req-b420028e-8528-4249-9eec-8fdc4ae1ecd3 req-3ce58197-437d-4105-aba7-cf2dab1de7f6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:22:25 np0005538960 nova_compute[187252]: 2025-11-28 16:22:25.080 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:22:25 np0005538960 nova_compute[187252]: 2025-11-28 16:22:25.080 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:22:25 np0005538960 nova_compute[187252]: 2025-11-28 16:22:25.081 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid dab22d6d-ad90-4a53-b395-4d8aa1875048 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:22:25 np0005538960 nova_compute[187252]: 2025-11-28 16:22:25.297 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:26 np0005538960 nova_compute[187252]: 2025-11-28 16:22:26.241 187256 DEBUG nova.network.neutron [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Port 83c201fe-1c64-4b7f-a908-de59340b5670 binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Nov 28 11:22:26 np0005538960 nova_compute[187252]: 2025-11-28 16:22:26.242 187256 DEBUG oslo_concurrency.lockutils [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:22:27 np0005538960 nova_compute[187252]: 2025-11-28 16:22:27.696 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:28 np0005538960 podman[216552]: 2025-11-28 16:22:28.233002213 +0000 UTC m=+0.129524804 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:22:29 np0005538960 nova_compute[187252]: 2025-11-28 16:22:29.393 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updating instance_info_cache with network_info: [{"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:22:29 np0005538960 nova_compute[187252]: 2025-11-28 16:22:29.427 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:22:29 np0005538960 nova_compute[187252]: 2025-11-28 16:22:29.428 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 11:22:29 np0005538960 nova_compute[187252]: 2025-11-28 16:22:29.429 187256 DEBUG oslo_concurrency.lockutils [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquired lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:22:29 np0005538960 nova_compute[187252]: 2025-11-28 16:22:29.429 187256 DEBUG nova.network.neutron [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:22:29 np0005538960 nova_compute[187252]: 2025-11-28 16:22:29.430 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:22:29 np0005538960 nova_compute[187252]: 2025-11-28 16:22:29.430 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:22:30 np0005538960 nova_compute[187252]: 2025-11-28 16:22:30.299 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:30 np0005538960 nova_compute[187252]: 2025-11-28 16:22:30.424 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:22:32 np0005538960 podman[216581]: 2025-11-28 16:22:32.15440175 +0000 UTC m=+0.060619358 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd)
Nov 28 11:22:32 np0005538960 podman[216582]: 2025-11-28 16:22:32.154479972 +0000 UTC m=+0.057293657 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:22:32 np0005538960 nova_compute[187252]: 2025-11-28 16:22:32.698 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:32 np0005538960 nova_compute[187252]: 2025-11-28 16:22:32.708 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquiring lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:32 np0005538960 nova_compute[187252]: 2025-11-28 16:22:32.709 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:32 np0005538960 nova_compute[187252]: 2025-11-28 16:22:32.729 187256 DEBUG nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:22:32 np0005538960 nova_compute[187252]: 2025-11-28 16:22:32.828 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:32 np0005538960 nova_compute[187252]: 2025-11-28 16:22:32.829 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:32 np0005538960 nova_compute[187252]: 2025-11-28 16:22:32.837 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:22:32 np0005538960 nova_compute[187252]: 2025-11-28 16:22:32.837 187256 INFO nova.compute.claims [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:22:32 np0005538960 nova_compute[187252]: 2025-11-28 16:22:32.993 187256 DEBUG nova.compute.provider_tree [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.016 187256 DEBUG nova.scheduler.client.report [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.045 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.046 187256 DEBUG nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.096 187256 DEBUG nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.096 187256 DEBUG nova.network.neutron [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.140 187256 INFO nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.170 187256 DEBUG nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.303 187256 DEBUG nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.304 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.305 187256 INFO nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Creating image(s)
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.305 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquiring lock "/var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.306 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "/var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.306 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "/var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.318 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.384 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.386 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.386 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.400 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.462 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.463 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.509 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.511 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.511 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.574 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.576 187256 DEBUG nova.virt.disk.api [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Checking if we can resize image /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.576 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:22:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:33Z|00073|binding|INFO|Releasing lport 07e83ec8-b82c-434a-a6ae-23785661b2f5 from this chassis (sb_readonly=0)
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.634 187256 DEBUG nova.network.neutron [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updating instance_info_cache with network_info: [{"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.637 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.640 187256 DEBUG nova.policy [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '070b2b4a7f634b70a38a8f51ce54dd63', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b95cbfd446a3402a9845b8e54a0539b1', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.646 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.646 187256 DEBUG nova.virt.disk.api [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Cannot resize image /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.647 187256 DEBUG nova.objects.instance [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lazy-loading 'migration_context' on Instance uuid a59a72c8-b3b5-407f-8a7a-f939a34b5c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.709 187256 DEBUG oslo_concurrency.lockutils [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Releasing lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.710 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.711 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Ensure instance console log exists: /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.711 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.711 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.712 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.737 187256 DEBUG nova.virt.libvirt.driver [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Creating tmpfile /var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048/tmpxwtwjo2r to verify with other compute node that the instance is on the same shared storage. check_instance_shared_storage_local /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:9618
Nov 28 11:22:33 np0005538960 kernel: tap83c201fe-1c (unregistering): left promiscuous mode
Nov 28 11:22:33 np0005538960 NetworkManager[55548]: <info>  [1764346953.7733] device (tap83c201fe-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:22:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:33Z|00074|binding|INFO|Releasing lport 83c201fe-1c64-4b7f-a908-de59340b5670 from this chassis (sb_readonly=0)
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.785 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:22:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:33Z|00075|binding|INFO|Setting lport 83c201fe-1c64-4b7f-a908-de59340b5670 down in Southbound
Nov 28 11:22:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:33Z|00076|binding|INFO|Removing iface tap83c201fe-1c ovn-installed in OVS
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.789 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:22:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:33.795 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d1:55:0f 10.100.0.3'], port_security=['fa:16:3e:d1:55:0f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dab22d6d-ad90-4a53-b395-4d8aa1875048', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '8', 'neutron:security_group_ids': '73670fc6-df48-4dc6-93f4-06266035780d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.238', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a285ce7-2378-4e7f-88d4-82e315e4ae1e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=83c201fe-1c64-4b7f-a908-de59340b5670) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 11:22:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:33.798 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 83c201fe-1c64-4b7f-a908-de59340b5670 in datapath e507f2fa-d0dd-4152-9c54-7c943a1fdc5b unbound from our chassis
Nov 28 11:22:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:33.800 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e507f2fa-d0dd-4152-9c54-7c943a1fdc5b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Nov 28 11:22:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:33.801 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[86da6ff0-fdf4-4b15-8497-f086a51d3f81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:22:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:33.802 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b namespace which is not needed anymore
Nov 28 11:22:33 np0005538960 nova_compute[187252]: 2025-11-28 16:22:33.803 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:22:33 np0005538960 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Nov 28 11:22:33 np0005538960 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000d.scope: Consumed 13.059s CPU time.
Nov 28 11:22:33 np0005538960 systemd-machined[153518]: Machine qemu-5-instance-0000000d terminated.
Nov 28 11:22:33 np0005538960 neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b[216504]: [NOTICE]   (216525) : haproxy version is 2.8.14-c23fe91
Nov 28 11:22:33 np0005538960 neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b[216504]: [NOTICE]   (216525) : path to executable is /usr/sbin/haproxy
Nov 28 11:22:33 np0005538960 neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b[216504]: [ALERT]    (216525) : Current worker (216534) exited with code 143 (Terminated)
Nov 28 11:22:33 np0005538960 neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b[216504]: [WARNING]  (216525) : All workers exited. Exiting... (0)
Nov 28 11:22:33 np0005538960 systemd[1]: libpod-924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6.scope: Deactivated successfully.
Nov 28 11:22:33 np0005538960 podman[216669]: 2025-11-28 16:22:33.955360483 +0000 UTC m=+0.053816241 container died 924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 11:22:33 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6-userdata-shm.mount: Deactivated successfully.
Nov 28 11:22:33 np0005538960 systemd[1]: var-lib-containers-storage-overlay-e51fb827aa5efac33200f62c419234325b6970feb70727c3cd7ad6476bb62e17-merged.mount: Deactivated successfully.
Nov 28 11:22:34 np0005538960 podman[216669]: 2025-11-28 16:22:34.005656148 +0000 UTC m=+0.104111896 container cleanup 924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:22:34 np0005538960 systemd[1]: libpod-conmon-924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6.scope: Deactivated successfully.
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.022 187256 INFO nova.virt.libvirt.driver [-] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Instance destroyed successfully.
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.022 187256 DEBUG nova.objects.instance [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'resources' on Instance uuid dab22d6d-ad90-4a53-b395-4d8aa1875048 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.036 187256 DEBUG nova.virt.libvirt.vif [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:21:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1213806647',display_name='tempest-TestNetworkAdvancedServerOps-server-1213806647',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1213806647',id=13,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH58GHoYAWcLcaiGAlD/LZfshXsPCF6Sn6OBas+pyQHOdCLCNFrTg+8Mm+RJG2mUOWsRqQEkkKfgShtb4l9p6G4CKRyYbKoZHPcpZa1CIwxB3EKlka8FFnYuHyFBQ07OUQ==',key_name='tempest-TestNetworkAdvancedServerOps-365393498',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:22:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-jm0yqf14',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:22:21Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=dab22d6d-ad90-4a53-b395-4d8aa1875048,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.037 187256 DEBUG nova.network.os_vif_util [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.037 187256 DEBUG nova.network.os_vif_util [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d1:55:0f,bridge_name='br-int',has_traffic_filtering=True,id=83c201fe-1c64-4b7f-a908-de59340b5670,network=Network(e507f2fa-d0dd-4152-9c54-7c943a1fdc5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83c201fe-1c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.038 187256 DEBUG os_vif [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:55:0f,bridge_name='br-int',has_traffic_filtering=True,id=83c201fe-1c64-4b7f-a908-de59340b5670,network=Network(e507f2fa-d0dd-4152-9c54-7c943a1fdc5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83c201fe-1c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.040 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.040 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap83c201fe-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.042 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.044 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.046 187256 INFO os_vif [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d1:55:0f,bridge_name='br-int',has_traffic_filtering=True,id=83c201fe-1c64-4b7f-a908-de59340b5670,network=Network(e507f2fa-d0dd-4152-9c54-7c943a1fdc5b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap83c201fe-1c')
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.047 187256 INFO nova.virt.libvirt.driver [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Deleting instance files /var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048_del
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.054 187256 INFO nova.virt.libvirt.driver [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Deletion of /var/lib/nova/instances/dab22d6d-ad90-4a53-b395-4d8aa1875048_del complete
Nov 28 11:22:34 np0005538960 podman[216714]: 2025-11-28 16:22:34.074657658 +0000 UTC m=+0.045559630 container remove 924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:22:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:34.081 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2cdff545-8cf0-44a0-bc10-b8a3c016277a]: (4, ('Fri Nov 28 04:22:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b (924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6)\n924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6\nFri Nov 28 04:22:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b (924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6)\n924640a5a9aa0f44af1f42e54f79f4f99b8783fbfeeb9390f21a2dc133bcb6e6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:34.083 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b96b2f-363e-4e44-9e0e-e1751082f425]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:34.084 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape507f2fa-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.086 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:34 np0005538960 kernel: tape507f2fa-d0: left promiscuous mode
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.088 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:34.093 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[bf7442b8-7c1e-40cd-b479-d476ff555902]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.099 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:34.106 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a2185adc-0925-43ce-8df2-37568eb86ffc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:34.108 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[83b24af6-8fce-4d1e-b0a9-b1e8482a1d79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:34.124 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb9350e-e12e-49ff-bd14-0e3d2f4cac31]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404015, 'reachable_time': 28530, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216730, 'error': None, 'target': 'ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:34.128 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e507f2fa-d0dd-4152-9c54-7c943a1fdc5b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:22:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:34.128 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c55de6-fdb5-4e79-a002-42bbb522b63d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:34 np0005538960 systemd[1]: run-netns-ovnmeta\x2de507f2fa\x2dd0dd\x2d4152\x2d9c54\x2d7c943a1fdc5b.mount: Deactivated successfully.
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.183 187256 DEBUG oslo_concurrency.lockutils [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.184 187256 DEBUG oslo_concurrency.lockutils [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.198 187256 DEBUG nova.compute.manager [req-2479d93a-07b5-4615-91b9-217f8fa631db req-d8454e97-7156-4720-b062-76467df0f64c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received event network-vif-unplugged-83c201fe-1c64-4b7f-a908-de59340b5670 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.199 187256 DEBUG oslo_concurrency.lockutils [req-2479d93a-07b5-4615-91b9-217f8fa631db req-d8454e97-7156-4720-b062-76467df0f64c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.199 187256 DEBUG oslo_concurrency.lockutils [req-2479d93a-07b5-4615-91b9-217f8fa631db req-d8454e97-7156-4720-b062-76467df0f64c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.199 187256 DEBUG oslo_concurrency.lockutils [req-2479d93a-07b5-4615-91b9-217f8fa631db req-d8454e97-7156-4720-b062-76467df0f64c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.199 187256 DEBUG nova.compute.manager [req-2479d93a-07b5-4615-91b9-217f8fa631db req-d8454e97-7156-4720-b062-76467df0f64c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] No waiting events found dispatching network-vif-unplugged-83c201fe-1c64-4b7f-a908-de59340b5670 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.200 187256 WARNING nova.compute.manager [req-2479d93a-07b5-4615-91b9-217f8fa631db req-d8454e97-7156-4720-b062-76467df0f64c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received unexpected event network-vif-unplugged-83c201fe-1c64-4b7f-a908-de59340b5670 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.226 187256 DEBUG nova.objects.instance [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'migration_context' on Instance uuid dab22d6d-ad90-4a53-b395-4d8aa1875048 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.326 187256 DEBUG nova.compute.provider_tree [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.358 187256 DEBUG nova.scheduler.client.report [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:22:34 np0005538960 nova_compute[187252]: 2025-11-28 16:22:34.426 187256 DEBUG oslo_concurrency.lockutils [None req-3b405095-8394-46c8-9ff2-dbecc580d94c 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.242s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.309 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.310 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.311 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:22:35.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:22:35 np0005538960 nova_compute[187252]: 2025-11-28 16:22:35.676 187256 DEBUG nova.network.neutron [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Successfully created port: 6afaadc4-fd5f-49fa-80df-cf437202f1e3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:22:36 np0005538960 nova_compute[187252]: 2025-11-28 16:22:36.350 187256 DEBUG nova.compute.manager [req-846a65d9-89c1-41c9-8ca4-264940c17ae1 req-a9cea1b4-894b-40bd-a0d6-3a12c3f09987 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received event network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:36 np0005538960 nova_compute[187252]: 2025-11-28 16:22:36.350 187256 DEBUG oslo_concurrency.lockutils [req-846a65d9-89c1-41c9-8ca4-264940c17ae1 req-a9cea1b4-894b-40bd-a0d6-3a12c3f09987 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:36 np0005538960 nova_compute[187252]: 2025-11-28 16:22:36.350 187256 DEBUG oslo_concurrency.lockutils [req-846a65d9-89c1-41c9-8ca4-264940c17ae1 req-a9cea1b4-894b-40bd-a0d6-3a12c3f09987 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:36 np0005538960 nova_compute[187252]: 2025-11-28 16:22:36.351 187256 DEBUG oslo_concurrency.lockutils [req-846a65d9-89c1-41c9-8ca4-264940c17ae1 req-a9cea1b4-894b-40bd-a0d6-3a12c3f09987 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:36 np0005538960 nova_compute[187252]: 2025-11-28 16:22:36.351 187256 DEBUG nova.compute.manager [req-846a65d9-89c1-41c9-8ca4-264940c17ae1 req-a9cea1b4-894b-40bd-a0d6-3a12c3f09987 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] No waiting events found dispatching network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:22:36 np0005538960 nova_compute[187252]: 2025-11-28 16:22:36.351 187256 WARNING nova.compute.manager [req-846a65d9-89c1-41c9-8ca4-264940c17ae1 req-a9cea1b4-894b-40bd-a0d6-3a12c3f09987 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received unexpected event network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 28 11:22:37 np0005538960 podman[216731]: 2025-11-28 16:22:37.155669788 +0000 UTC m=+0.055372980 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:22:37 np0005538960 nova_compute[187252]: 2025-11-28 16:22:37.701 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:38 np0005538960 nova_compute[187252]: 2025-11-28 16:22:38.042 187256 DEBUG nova.network.neutron [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Successfully updated port: 6afaadc4-fd5f-49fa-80df-cf437202f1e3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:22:38 np0005538960 nova_compute[187252]: 2025-11-28 16:22:38.065 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquiring lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:22:38 np0005538960 nova_compute[187252]: 2025-11-28 16:22:38.066 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquired lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:22:38 np0005538960 nova_compute[187252]: 2025-11-28 16:22:38.066 187256 DEBUG nova.network.neutron [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:22:38 np0005538960 nova_compute[187252]: 2025-11-28 16:22:38.873 187256 DEBUG nova.network.neutron [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:22:39 np0005538960 nova_compute[187252]: 2025-11-28 16:22:39.084 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.008 187256 DEBUG nova.compute.manager [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received event network-changed-83c201fe-1c64-4b7f-a908-de59340b5670 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.009 187256 DEBUG nova.compute.manager [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Refreshing instance network info cache due to event network-changed-83c201fe-1c64-4b7f-a908-de59340b5670. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.009 187256 DEBUG oslo_concurrency.lockutils [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.009 187256 DEBUG oslo_concurrency.lockutils [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.009 187256 DEBUG nova.network.neutron [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Refreshing network info cache for port 83c201fe-1c64-4b7f-a908-de59340b5670 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:22:40 np0005538960 podman[216755]: 2025-11-28 16:22:40.161081874 +0000 UTC m=+0.068130870 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, release=1755695350, version=9.6, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.872 187256 DEBUG nova.network.neutron [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Updating instance_info_cache with network_info: [{"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.906 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Releasing lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.907 187256 DEBUG nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Instance network_info: |[{"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.910 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Start _get_guest_xml network_info=[{"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.916 187256 WARNING nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.929 187256 DEBUG nova.virt.libvirt.host [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.930 187256 DEBUG nova.virt.libvirt.host [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.936 187256 DEBUG nova.virt.libvirt.host [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.937 187256 DEBUG nova.virt.libvirt.host [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.939 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.941 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.942 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.942 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.942 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.942 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.943 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.943 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.944 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.945 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.945 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.946 187256 DEBUG nova.virt.hardware [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.949 187256 DEBUG nova.virt.libvirt.vif [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:22:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-890711526-access_point-863719353',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-890711526-access_point-863719353',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-890711526-acc',id=16,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL4T/R7E/vFcSeE0fzLvkiSoTLLYGisURb0A/0geduwhytTpHsgju71rV8vWYR0ERy2GKPWTieugKB6vZ3VOh75uVQKt+R6zhf4sxILFG2K8YDw/c++kFBMV6Ggxna9ifg==',key_name='tempest-TestSecurityGroupsBasicOps-1568398266',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b95cbfd446a3402a9845b8e54a0539b1',ramdisk_id='',reservation_id='r-pibpvlng',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-890711526',owner_user_name='tempest-TestSecurityGroupsBasicOps-890711526-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:22:33Z,user_data=None,user_id='070b2b4a7f634b70a38a8f51ce54dd63',uuid=a59a72c8-b3b5-407f-8a7a-f939a34b5c75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.950 187256 DEBUG nova.network.os_vif_util [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Converting VIF {"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.951 187256 DEBUG nova.network.os_vif_util [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:a5:6f,bridge_name='br-int',has_traffic_filtering=True,id=6afaadc4-fd5f-49fa-80df-cf437202f1e3,network=Network(a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6afaadc4-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.953 187256 DEBUG nova.objects.instance [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lazy-loading 'pci_devices' on Instance uuid a59a72c8-b3b5-407f-8a7a-f939a34b5c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.977 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <uuid>a59a72c8-b3b5-407f-8a7a-f939a34b5c75</uuid>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <name>instance-00000010</name>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-890711526-access_point-863719353</nova:name>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:22:40</nova:creationTime>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:        <nova:user uuid="070b2b4a7f634b70a38a8f51ce54dd63">tempest-TestSecurityGroupsBasicOps-890711526-project-member</nova:user>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:        <nova:project uuid="b95cbfd446a3402a9845b8e54a0539b1">tempest-TestSecurityGroupsBasicOps-890711526</nova:project>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:        <nova:port uuid="6afaadc4-fd5f-49fa-80df-cf437202f1e3">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <entry name="serial">a59a72c8-b3b5-407f-8a7a-f939a34b5c75</entry>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <entry name="uuid">a59a72c8-b3b5-407f-8a7a-f939a34b5c75</entry>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk.config"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:85:a5:6f"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <target dev="tap6afaadc4-fd"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/console.log" append="off"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:22:40 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:22:40 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:22:40 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:22:40 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.979 187256 DEBUG nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Preparing to wait for external event network-vif-plugged-6afaadc4-fd5f-49fa-80df-cf437202f1e3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.979 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquiring lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.979 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.980 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.980 187256 DEBUG nova.virt.libvirt.vif [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:22:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-890711526-access_point-863719353',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-890711526-access_point-863719353',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-890711526-acc',id=16,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL4T/R7E/vFcSeE0fzLvkiSoTLLYGisURb0A/0geduwhytTpHsgju71rV8vWYR0ERy2GKPWTieugKB6vZ3VOh75uVQKt+R6zhf4sxILFG2K8YDw/c++kFBMV6Ggxna9ifg==',key_name='tempest-TestSecurityGroupsBasicOps-1568398266',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b95cbfd446a3402a9845b8e54a0539b1',ramdisk_id='',reservation_id='r-pibpvlng',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-890711526',owner_user_name='tempest-TestSecurityGroupsBasicOps-890711526-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:22:33Z,user_data=None,user_id='070b2b4a7f634b70a38a8f51ce54dd63',uuid=a59a72c8-b3b5-407f-8a7a-f939a34b5c75,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.981 187256 DEBUG nova.network.os_vif_util [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Converting VIF {"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.981 187256 DEBUG nova.network.os_vif_util [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:a5:6f,bridge_name='br-int',has_traffic_filtering=True,id=6afaadc4-fd5f-49fa-80df-cf437202f1e3,network=Network(a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6afaadc4-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.982 187256 DEBUG os_vif [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:a5:6f,bridge_name='br-int',has_traffic_filtering=True,id=6afaadc4-fd5f-49fa-80df-cf437202f1e3,network=Network(a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6afaadc4-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.982 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.983 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.983 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.986 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.987 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6afaadc4-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.987 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6afaadc4-fd, col_values=(('external_ids', {'iface-id': '6afaadc4-fd5f-49fa-80df-cf437202f1e3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:a5:6f', 'vm-uuid': 'a59a72c8-b3b5-407f-8a7a-f939a34b5c75'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:40 np0005538960 NetworkManager[55548]: <info>  [1764346960.9904] manager: (tap6afaadc4-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.989 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.993 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:22:40 np0005538960 nova_compute[187252]: 2025-11-28 16:22:40.998 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.000 187256 INFO os_vif [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:a5:6f,bridge_name='br-int',has_traffic_filtering=True,id=6afaadc4-fd5f-49fa-80df-cf437202f1e3,network=Network(a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6afaadc4-fd')#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.067 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.067 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.067 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] No VIF found with MAC fa:16:3e:85:a5:6f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.068 187256 INFO nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Using config drive#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.919 187256 DEBUG nova.compute.manager [req-62c937e0-3479-443b-a1ea-32b05e82365a req-46fd7c52-31a5-4834-8b05-cb57755e8416 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received event network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.920 187256 DEBUG oslo_concurrency.lockutils [req-62c937e0-3479-443b-a1ea-32b05e82365a req-46fd7c52-31a5-4834-8b05-cb57755e8416 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.920 187256 DEBUG oslo_concurrency.lockutils [req-62c937e0-3479-443b-a1ea-32b05e82365a req-46fd7c52-31a5-4834-8b05-cb57755e8416 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.920 187256 DEBUG oslo_concurrency.lockutils [req-62c937e0-3479-443b-a1ea-32b05e82365a req-46fd7c52-31a5-4834-8b05-cb57755e8416 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "dab22d6d-ad90-4a53-b395-4d8aa1875048-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.921 187256 DEBUG nova.compute.manager [req-62c937e0-3479-443b-a1ea-32b05e82365a req-46fd7c52-31a5-4834-8b05-cb57755e8416 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] No waiting events found dispatching network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:22:41 np0005538960 nova_compute[187252]: 2025-11-28 16:22:41.921 187256 WARNING nova.compute.manager [req-62c937e0-3479-443b-a1ea-32b05e82365a req-46fd7c52-31a5-4834-8b05-cb57755e8416 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Received unexpected event network-vif-plugged-83c201fe-1c64-4b7f-a908-de59340b5670 for instance with vm_state resized and task_state resize_reverting.#033[00m
Nov 28 11:22:42 np0005538960 nova_compute[187252]: 2025-11-28 16:22:42.265 187256 INFO nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Creating config drive at /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk.config#033[00m
Nov 28 11:22:42 np0005538960 nova_compute[187252]: 2025-11-28 16:22:42.269 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbil02hke execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:22:42 np0005538960 nova_compute[187252]: 2025-11-28 16:22:42.396 187256 DEBUG oslo_concurrency.processutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbil02hke" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:22:42 np0005538960 kernel: tap6afaadc4-fd: entered promiscuous mode
Nov 28 11:22:42 np0005538960 NetworkManager[55548]: <info>  [1764346962.4631] manager: (tap6afaadc4-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Nov 28 11:22:42 np0005538960 nova_compute[187252]: 2025-11-28 16:22:42.464 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:42 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:42Z|00077|binding|INFO|Claiming lport 6afaadc4-fd5f-49fa-80df-cf437202f1e3 for this chassis.
Nov 28 11:22:42 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:42Z|00078|binding|INFO|6afaadc4-fd5f-49fa-80df-cf437202f1e3: Claiming fa:16:3e:85:a5:6f 10.100.0.5
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.483 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:a5:6f 10.100.0.5'], port_security=['fa:16:3e:85:a5:6f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a59a72c8-b3b5-407f-8a7a-f939a34b5c75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b95cbfd446a3402a9845b8e54a0539b1', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3dc8fbc7-f735-4159-bfeb-4b9a5ad39eb4 c5cc017c-c3ae-4527-bcc3-ef0b586790a1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ceedba3-308b-43c0-901a-53307d08facf, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=6afaadc4-fd5f-49fa-80df-cf437202f1e3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:22:42 np0005538960 nova_compute[187252]: 2025-11-28 16:22:42.484 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.485 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 6afaadc4-fd5f-49fa-80df-cf437202f1e3 in datapath a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d bound to our chassis#033[00m
Nov 28 11:22:42 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:42Z|00079|binding|INFO|Setting lport 6afaadc4-fd5f-49fa-80df-cf437202f1e3 ovn-installed in OVS
Nov 28 11:22:42 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:42Z|00080|binding|INFO|Setting lport 6afaadc4-fd5f-49fa-80df-cf437202f1e3 up in Southbound
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.487 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d#033[00m
Nov 28 11:22:42 np0005538960 nova_compute[187252]: 2025-11-28 16:22:42.489 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:42 np0005538960 systemd-udevd[216797]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.500 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3e4ebbed-7b5a-4875-9d2f-bf5e3ae24fe5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.501 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa4ee5c01-b1 in ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.503 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa4ee5c01-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.503 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6272a16b-ea30-4528-af21-bb400f5d08c7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.504 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[64e5e587-f31e-48a1-b86a-10d8759a9528]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 systemd-machined[153518]: New machine qemu-6-instance-00000010.
Nov 28 11:22:42 np0005538960 NetworkManager[55548]: <info>  [1764346962.5176] device (tap6afaadc4-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:22:42 np0005538960 NetworkManager[55548]: <info>  [1764346962.5203] device (tap6afaadc4-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.519 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[00c8058b-84db-4671-882b-f83231fa5409]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 systemd[1]: Started Virtual Machine qemu-6-instance-00000010.
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.537 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[70ce27e8-23da-405e-9b87-83ea3fa49727]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.572 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[510355aa-eaee-4afd-bc13-d9953692c10c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 NetworkManager[55548]: <info>  [1764346962.5811] manager: (tapa4ee5c01-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.579 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2eaafc9a-1971-4520-8d24-6c4581a8b91e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.618 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2a63d8-904a-4c38-a3fa-164e6c206111]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.621 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[c7527d42-9671-4453-a221-174aa50001fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 NetworkManager[55548]: <info>  [1764346962.6509] device (tapa4ee5c01-b0): carrier: link connected
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.657 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[b57ffad6-e5d5-46e7-a4ab-1f05c4de9218]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.680 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2f0a3e81-518f-4ab6-8041-de3fba7dba99]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa4ee5c01-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:f9:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406223, 'reachable_time': 25846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216830, 'error': None, 'target': 'ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.700 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2fe059-ee81-4917-92b9-452923ce6bf2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe48:f980'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 406223, 'tstamp': 406223}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216831, 'error': None, 'target': 'ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 nova_compute[187252]: 2025-11-28 16:22:42.706 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.724 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[fa2bd639-255e-4df8-91e5-c1044e73e54b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa4ee5c01-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:f9:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 27], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406223, 'reachable_time': 25846, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216832, 'error': None, 'target': 'ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.762 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac543ae-25c3-42fa-96cb-07dc165f2990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.836 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c437f68f-8688-4e62-b261-ebbf83c7bfb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.838 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4ee5c01-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.839 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.839 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa4ee5c01-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:42 np0005538960 nova_compute[187252]: 2025-11-28 16:22:42.912 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:42 np0005538960 NetworkManager[55548]: <info>  [1764346962.9136] manager: (tapa4ee5c01-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Nov 28 11:22:42 np0005538960 kernel: tapa4ee5c01-b0: entered promiscuous mode
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.920 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa4ee5c01-b0, col_values=(('external_ids', {'iface-id': 'ec7a1044-99b4-4f30-9983-8c85fdd93798'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:22:42 np0005538960 nova_compute[187252]: 2025-11-28 16:22:42.921 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:42 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:42Z|00081|binding|INFO|Releasing lport ec7a1044-99b4-4f30-9983-8c85fdd93798 from this chassis (sb_readonly=0)
Nov 28 11:22:42 np0005538960 nova_compute[187252]: 2025-11-28 16:22:42.938 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.939 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.941 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[76b711de-a432-4f6c-bb5b-f044689f0bb1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.942 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d.pid.haproxy
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:22:42 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:22:42.943 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d', 'env', 'PROCESS_TAG=haproxy-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.275 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346963.2741158, a59a72c8-b3b5-407f-8a7a-f939a34b5c75 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.276 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] VM Started (Lifecycle Event)#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.307 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.313 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346963.2758648, a59a72c8-b3b5-407f-8a7a-f939a34b5c75 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.313 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.352 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.355 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.379 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:22:43 np0005538960 podman[216871]: 2025-11-28 16:22:43.387776161 +0000 UTC m=+0.053783892 container create 9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:22:43 np0005538960 systemd[1]: Started libpod-conmon-9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59.scope.
Nov 28 11:22:43 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:22:43 np0005538960 podman[216871]: 2025-11-28 16:22:43.358748323 +0000 UTC m=+0.024756084 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:22:43 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9a0cce5f49af2bd13705321e62d1cf9a5dec24c69323a5765bff0e52228f4c80/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:22:43 np0005538960 podman[216871]: 2025-11-28 16:22:43.474032451 +0000 UTC m=+0.140040202 container init 9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 11:22:43 np0005538960 podman[216871]: 2025-11-28 16:22:43.480209792 +0000 UTC m=+0.146217523 container start 9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 11:22:43 np0005538960 neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d[216886]: [NOTICE]   (216890) : New worker (216892) forked
Nov 28 11:22:43 np0005538960 neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d[216886]: [NOTICE]   (216890) : Loading success.
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.511 187256 DEBUG nova.network.neutron [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updated VIF entry in instance network info cache for port 83c201fe-1c64-4b7f-a908-de59340b5670. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.512 187256 DEBUG nova.network.neutron [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Updating instance_info_cache with network_info: [{"id": "83c201fe-1c64-4b7f-a908-de59340b5670", "address": "fa:16:3e:d1:55:0f", "network": {"id": "e507f2fa-d0dd-4152-9c54-7c943a1fdc5b", "bridge": "br-int", "label": "tempest-network-smoke--703408347", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap83c201fe-1c", "ovs_interfaceid": "83c201fe-1c64-4b7f-a908-de59340b5670", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.553 187256 DEBUG oslo_concurrency.lockutils [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-dab22d6d-ad90-4a53-b395-4d8aa1875048" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.554 187256 DEBUG nova.compute.manager [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Received event network-changed-6afaadc4-fd5f-49fa-80df-cf437202f1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.554 187256 DEBUG nova.compute.manager [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Refreshing instance network info cache due to event network-changed-6afaadc4-fd5f-49fa-80df-cf437202f1e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.554 187256 DEBUG oslo_concurrency.lockutils [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.554 187256 DEBUG oslo_concurrency.lockutils [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:22:43 np0005538960 nova_compute[187252]: 2025-11-28 16:22:43.554 187256 DEBUG nova.network.neutron [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Refreshing network info cache for port 6afaadc4-fd5f-49fa-80df-cf437202f1e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:22:45 np0005538960 nova_compute[187252]: 2025-11-28 16:22:45.991 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.282 187256 DEBUG nova.compute.manager [req-b67dc8d6-1b3c-4197-b9d3-e45a06eec96b req-93a7713f-807c-41b6-9ade-6cf7860de294 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Received event network-vif-plugged-6afaadc4-fd5f-49fa-80df-cf437202f1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.282 187256 DEBUG oslo_concurrency.lockutils [req-b67dc8d6-1b3c-4197-b9d3-e45a06eec96b req-93a7713f-807c-41b6-9ade-6cf7860de294 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.283 187256 DEBUG oslo_concurrency.lockutils [req-b67dc8d6-1b3c-4197-b9d3-e45a06eec96b req-93a7713f-807c-41b6-9ade-6cf7860de294 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.283 187256 DEBUG oslo_concurrency.lockutils [req-b67dc8d6-1b3c-4197-b9d3-e45a06eec96b req-93a7713f-807c-41b6-9ade-6cf7860de294 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.283 187256 DEBUG nova.compute.manager [req-b67dc8d6-1b3c-4197-b9d3-e45a06eec96b req-93a7713f-807c-41b6-9ade-6cf7860de294 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Processing event network-vif-plugged-6afaadc4-fd5f-49fa-80df-cf437202f1e3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.284 187256 DEBUG nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.299 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764346966.2994647, a59a72c8-b3b5-407f-8a7a-f939a34b5c75 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.300 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.303 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.307 187256 INFO nova.virt.libvirt.driver [-] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Instance spawned successfully.#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.307 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.350 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.356 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.356 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.356 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.357 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.357 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.357 187256 DEBUG nova.virt.libvirt.driver [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.361 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.403 187256 DEBUG nova.network.neutron [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Updated VIF entry in instance network info cache for port 6afaadc4-fd5f-49fa-80df-cf437202f1e3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.404 187256 DEBUG nova.network.neutron [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Updating instance_info_cache with network_info: [{"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.408 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.460 187256 DEBUG oslo_concurrency.lockutils [req-3cf18960-e94d-4af6-a792-6af0b452f95e req-ffe51520-2a64-4b9f-9229-43037ad385e0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.475 187256 INFO nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Took 13.17 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.476 187256 DEBUG nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.567 187256 INFO nova.compute.manager [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Took 13.77 seconds to build instance.#033[00m
Nov 28 11:22:46 np0005538960 nova_compute[187252]: 2025-11-28 16:22:46.584 187256 DEBUG oslo_concurrency.lockutils [None req-067bf4e1-a848-49d7-98dc-c1c7600e70e1 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:47 np0005538960 nova_compute[187252]: 2025-11-28 16:22:47.711 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:48 np0005538960 nova_compute[187252]: 2025-11-28 16:22:48.718 187256 DEBUG nova.compute.manager [req-2478cff4-7f7b-44b5-bd61-05229423daa0 req-15d6d9ab-97fb-46e1-85f6-cbd8e24c00d5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Received event network-vif-plugged-6afaadc4-fd5f-49fa-80df-cf437202f1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:48 np0005538960 nova_compute[187252]: 2025-11-28 16:22:48.718 187256 DEBUG oslo_concurrency.lockutils [req-2478cff4-7f7b-44b5-bd61-05229423daa0 req-15d6d9ab-97fb-46e1-85f6-cbd8e24c00d5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:22:48 np0005538960 nova_compute[187252]: 2025-11-28 16:22:48.718 187256 DEBUG oslo_concurrency.lockutils [req-2478cff4-7f7b-44b5-bd61-05229423daa0 req-15d6d9ab-97fb-46e1-85f6-cbd8e24c00d5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:22:48 np0005538960 nova_compute[187252]: 2025-11-28 16:22:48.718 187256 DEBUG oslo_concurrency.lockutils [req-2478cff4-7f7b-44b5-bd61-05229423daa0 req-15d6d9ab-97fb-46e1-85f6-cbd8e24c00d5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:22:48 np0005538960 nova_compute[187252]: 2025-11-28 16:22:48.719 187256 DEBUG nova.compute.manager [req-2478cff4-7f7b-44b5-bd61-05229423daa0 req-15d6d9ab-97fb-46e1-85f6-cbd8e24c00d5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] No waiting events found dispatching network-vif-plugged-6afaadc4-fd5f-49fa-80df-cf437202f1e3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:22:48 np0005538960 nova_compute[187252]: 2025-11-28 16:22:48.719 187256 WARNING nova.compute.manager [req-2478cff4-7f7b-44b5-bd61-05229423daa0 req-15d6d9ab-97fb-46e1-85f6-cbd8e24c00d5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Received unexpected event network-vif-plugged-6afaadc4-fd5f-49fa-80df-cf437202f1e3 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:22:49 np0005538960 nova_compute[187252]: 2025-11-28 16:22:49.020 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764346954.01909, dab22d6d-ad90-4a53-b395-4d8aa1875048 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:22:49 np0005538960 nova_compute[187252]: 2025-11-28 16:22:49.020 187256 INFO nova.compute.manager [-] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:22:49 np0005538960 nova_compute[187252]: 2025-11-28 16:22:49.052 187256 DEBUG nova.compute.manager [None req-d3e0722c-a7c2-4f4e-b134-ed4de7de9014 - - - - - -] [instance: dab22d6d-ad90-4a53-b395-4d8aa1875048] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:22:49 np0005538960 podman[216901]: 2025-11-28 16:22:49.174395794 +0000 UTC m=+0.076204012 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.license=GPLv2)
Nov 28 11:22:50 np0005538960 nova_compute[187252]: 2025-11-28 16:22:50.995 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:52 np0005538960 podman[216923]: 2025-11-28 16:22:52.158002271 +0000 UTC m=+0.058868318 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:22:52 np0005538960 nova_compute[187252]: 2025-11-28 16:22:52.712 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:56 np0005538960 nova_compute[187252]: 2025-11-28 16:22:55.999 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:57 np0005538960 nova_compute[187252]: 2025-11-28 16:22:57.715 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:22:58 np0005538960 nova_compute[187252]: 2025-11-28 16:22:58.494 187256 DEBUG nova.compute.manager [req-5f5be377-2cf2-4029-acf0-98209d46a33c req-64a707b6-494b-4fea-97a8-eee5ddc03e75 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Received event network-changed-6afaadc4-fd5f-49fa-80df-cf437202f1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:22:58 np0005538960 nova_compute[187252]: 2025-11-28 16:22:58.494 187256 DEBUG nova.compute.manager [req-5f5be377-2cf2-4029-acf0-98209d46a33c req-64a707b6-494b-4fea-97a8-eee5ddc03e75 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Refreshing instance network info cache due to event network-changed-6afaadc4-fd5f-49fa-80df-cf437202f1e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:22:58 np0005538960 nova_compute[187252]: 2025-11-28 16:22:58.494 187256 DEBUG oslo_concurrency.lockutils [req-5f5be377-2cf2-4029-acf0-98209d46a33c req-64a707b6-494b-4fea-97a8-eee5ddc03e75 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:22:58 np0005538960 nova_compute[187252]: 2025-11-28 16:22:58.495 187256 DEBUG oslo_concurrency.lockutils [req-5f5be377-2cf2-4029-acf0-98209d46a33c req-64a707b6-494b-4fea-97a8-eee5ddc03e75 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:22:58 np0005538960 nova_compute[187252]: 2025-11-28 16:22:58.495 187256 DEBUG nova.network.neutron [req-5f5be377-2cf2-4029-acf0-98209d46a33c req-64a707b6-494b-4fea-97a8-eee5ddc03e75 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Refreshing network info cache for port 6afaadc4-fd5f-49fa-80df-cf437202f1e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:22:59 np0005538960 podman[216951]: 2025-11-28 16:22:59.189987689 +0000 UTC m=+0.092330525 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 11:22:59 np0005538960 ovn_controller[95460]: 2025-11-28T16:22:59Z|00082|binding|INFO|Releasing lport ec7a1044-99b4-4f30-9983-8c85fdd93798 from this chassis (sb_readonly=0)
Nov 28 11:22:59 np0005538960 nova_compute[187252]: 2025-11-28 16:22:59.936 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:01 np0005538960 nova_compute[187252]: 2025-11-28 16:23:01.002 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:01 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:01Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:85:a5:6f 10.100.0.5
Nov 28 11:23:01 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:01Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:85:a5:6f 10.100.0.5
Nov 28 11:23:01 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:01.961 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:23:01 np0005538960 nova_compute[187252]: 2025-11-28 16:23:01.961 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:01 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:01.962 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:23:01 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:01.964 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:23:02 np0005538960 nova_compute[187252]: 2025-11-28 16:23:02.582 187256 DEBUG nova.network.neutron [req-5f5be377-2cf2-4029-acf0-98209d46a33c req-64a707b6-494b-4fea-97a8-eee5ddc03e75 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Updated VIF entry in instance network info cache for port 6afaadc4-fd5f-49fa-80df-cf437202f1e3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:23:02 np0005538960 nova_compute[187252]: 2025-11-28 16:23:02.583 187256 DEBUG nova.network.neutron [req-5f5be377-2cf2-4029-acf0-98209d46a33c req-64a707b6-494b-4fea-97a8-eee5ddc03e75 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Updating instance_info_cache with network_info: [{"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:23:02 np0005538960 nova_compute[187252]: 2025-11-28 16:23:02.608 187256 DEBUG oslo_concurrency.lockutils [req-5f5be377-2cf2-4029-acf0-98209d46a33c req-64a707b6-494b-4fea-97a8-eee5ddc03e75 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:23:02 np0005538960 nova_compute[187252]: 2025-11-28 16:23:02.717 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:03 np0005538960 podman[216991]: 2025-11-28 16:23:03.149636155 +0000 UTC m=+0.048562917 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Nov 28 11:23:03 np0005538960 podman[216990]: 2025-11-28 16:23:03.182686293 +0000 UTC m=+0.085918290 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 11:23:06 np0005538960 nova_compute[187252]: 2025-11-28 16:23:06.005 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:06.343 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:06.344 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:06.344 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:07 np0005538960 nova_compute[187252]: 2025-11-28 16:23:07.719 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:08 np0005538960 podman[217028]: 2025-11-28 16:23:08.175599393 +0000 UTC m=+0.070557104 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:23:10 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:10Z|00083|binding|INFO|Releasing lport ec7a1044-99b4-4f30-9983-8c85fdd93798 from this chassis (sb_readonly=0)
Nov 28 11:23:10 np0005538960 nova_compute[187252]: 2025-11-28 16:23:10.168 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:11 np0005538960 nova_compute[187252]: 2025-11-28 16:23:11.008 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:11 np0005538960 podman[217052]: 2025-11-28 16:23:11.19527714 +0000 UTC m=+0.091161166 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, version=9.6, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, 
vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:23:12 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:12Z|00084|binding|INFO|Releasing lport ec7a1044-99b4-4f30-9983-8c85fdd93798 from this chassis (sb_readonly=0)
Nov 28 11:23:12 np0005538960 nova_compute[187252]: 2025-11-28 16:23:12.300 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:12 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:12Z|00085|binding|INFO|Releasing lport ec7a1044-99b4-4f30-9983-8c85fdd93798 from this chassis (sb_readonly=0)
Nov 28 11:23:12 np0005538960 nova_compute[187252]: 2025-11-28 16:23:12.516 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:12 np0005538960 nova_compute[187252]: 2025-11-28 16:23:12.722 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:15 np0005538960 nova_compute[187252]: 2025-11-28 16:23:15.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:23:15 np0005538960 nova_compute[187252]: 2025-11-28 16:23:15.314 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:23:16 np0005538960 nova_compute[187252]: 2025-11-28 16:23:16.011 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:16 np0005538960 nova_compute[187252]: 2025-11-28 16:23:16.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.354 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.354 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.354 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.355 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.446 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.505 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.506 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.557 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.588 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.697 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.698 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5569MB free_disk=73.31377410888672GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.698 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.699 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.723 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.779 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance a59a72c8-b3b5-407f-8a7a-f939a34b5c75 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.779 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.780 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.842 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.859 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.884 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:23:17 np0005538960 nova_compute[187252]: 2025-11-28 16:23:17.884 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:18 np0005538960 nova_compute[187252]: 2025-11-28 16:23:18.879 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:23:19 np0005538960 nova_compute[187252]: 2025-11-28 16:23:19.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:23:19 np0005538960 nova_compute[187252]: 2025-11-28 16:23:19.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:23:19 np0005538960 nova_compute[187252]: 2025-11-28 16:23:19.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:23:20 np0005538960 nova_compute[187252]: 2025-11-28 16:23:20.155 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:23:20 np0005538960 nova_compute[187252]: 2025-11-28 16:23:20.155 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:23:20 np0005538960 nova_compute[187252]: 2025-11-28 16:23:20.156 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:23:20 np0005538960 nova_compute[187252]: 2025-11-28 16:23:20.156 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a59a72c8-b3b5-407f-8a7a-f939a34b5c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:23:20 np0005538960 podman[217082]: 2025-11-28 16:23:20.175226512 +0000 UTC m=+0.072676025 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 11:23:21 np0005538960 nova_compute[187252]: 2025-11-28 16:23:21.014 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:22 np0005538960 nova_compute[187252]: 2025-11-28 16:23:22.725 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:23 np0005538960 podman[217102]: 2025-11-28 16:23:23.1526971 +0000 UTC m=+0.057274438 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:23:25 np0005538960 nova_compute[187252]: 2025-11-28 16:23:25.454 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Updating instance_info_cache with network_info: [{"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:23:25 np0005538960 nova_compute[187252]: 2025-11-28 16:23:25.478 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:23:25 np0005538960 nova_compute[187252]: 2025-11-28 16:23:25.478 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 11:23:25 np0005538960 nova_compute[187252]: 2025-11-28 16:23:25.478 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:23:25 np0005538960 nova_compute[187252]: 2025-11-28 16:23:25.479 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:23:25 np0005538960 nova_compute[187252]: 2025-11-28 16:23:25.479 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:23:25 np0005538960 nova_compute[187252]: 2025-11-28 16:23:25.483 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:26 np0005538960 nova_compute[187252]: 2025-11-28 16:23:26.017 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:26 np0005538960 nova_compute[187252]: 2025-11-28 16:23:26.474 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:23:27 np0005538960 nova_compute[187252]: 2025-11-28 16:23:27.728 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:28 np0005538960 nova_compute[187252]: 2025-11-28 16:23:28.139 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:30 np0005538960 podman[217129]: 2025-11-28 16:23:30.218842204 +0000 UTC m=+0.109957065 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:23:31 np0005538960 nova_compute[187252]: 2025-11-28 16:23:31.020 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:32 np0005538960 nova_compute[187252]: 2025-11-28 16:23:32.730 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:33 np0005538960 nova_compute[187252]: 2025-11-28 16:23:33.053 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:33 np0005538960 nova_compute[187252]: 2025-11-28 16:23:33.619 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:34 np0005538960 podman[217154]: 2025-11-28 16:23:34.148466726 +0000 UTC m=+0.054673406 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:23:34 np0005538960 podman[217155]: 2025-11-28 16:23:34.155753533 +0000 UTC m=+0.058730654 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 11:23:36 np0005538960 nova_compute[187252]: 2025-11-28 16:23:36.022 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:36 np0005538960 nova_compute[187252]: 2025-11-28 16:23:36.426 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:37 np0005538960 nova_compute[187252]: 2025-11-28 16:23:37.733 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:37 np0005538960 nova_compute[187252]: 2025-11-28 16:23:37.859 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:37 np0005538960 nova_compute[187252]: 2025-11-28 16:23:37.860 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:37 np0005538960 nova_compute[187252]: 2025-11-28 16:23:37.888 187256 DEBUG nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:23:37 np0005538960 nova_compute[187252]: 2025-11-28 16:23:37.964 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:37 np0005538960 nova_compute[187252]: 2025-11-28 16:23:37.965 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:37 np0005538960 nova_compute[187252]: 2025-11-28 16:23:37.973 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:23:37 np0005538960 nova_compute[187252]: 2025-11-28 16:23:37.973 187256 INFO nova.compute.claims [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.113 187256 DEBUG nova.compute.provider_tree [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:23:38 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:38.114 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:66:6f 10.100.0.2 2001:db8::f816:3eff:fec1:666f'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec1:666f/64', 'neutron:device_id': 'ovnmeta-2e8458a4-db08-449a-a189-ad3fbf952e94', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e8458a4-db08-449a-a189-ad3fbf952e94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=796fbcf6-98ea-4eba-928e-3f69e912680b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f74f0f92-a0ab-4c68-8e9a-0baf8df6fcac) old=Port_Binding(mac=['fa:16:3e:c1:66:6f 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2e8458a4-db08-449a-a189-ad3fbf952e94', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2e8458a4-db08-449a-a189-ad3fbf952e94', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:23:38 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:38.116 104369 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f74f0f92-a0ab-4c68-8e9a-0baf8df6fcac in datapath 2e8458a4-db08-449a-a189-ad3fbf952e94 updated#033[00m
Nov 28 11:23:38 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:38.117 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2e8458a4-db08-449a-a189-ad3fbf952e94, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:23:38 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:38.118 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e983ad0c-54a7-455b-b949-3c1216c88dfa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.137 187256 DEBUG nova.scheduler.client.report [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.158 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.159 187256 DEBUG nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.203 187256 DEBUG nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.203 187256 DEBUG nova.network.neutron [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.224 187256 INFO nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.241 187256 DEBUG nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.320 187256 DEBUG nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.322 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.323 187256 INFO nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Creating image(s)#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.323 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "/var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.323 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "/var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.324 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "/var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.341 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.405 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.406 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.407 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.419 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.442 187256 DEBUG nova.policy [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.479 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.480 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.534 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.536 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.537 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.599 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.600 187256 DEBUG nova.virt.disk.api [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Checking if we can resize image /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.600 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.660 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.661 187256 DEBUG nova.virt.disk.api [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Cannot resize image /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.662 187256 DEBUG nova.objects.instance [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'migration_context' on Instance uuid cd030167-3f92-4d3e-9ecc-3be8d39ada4f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.687 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.688 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Ensure instance console log exists: /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.688 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.688 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:38 np0005538960 nova_compute[187252]: 2025-11-28 16:23:38.689 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:39 np0005538960 podman[217206]: 2025-11-28 16:23:39.161828165 +0000 UTC m=+0.063187044 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:23:41 np0005538960 nova_compute[187252]: 2025-11-28 16:23:41.025 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:42 np0005538960 nova_compute[187252]: 2025-11-28 16:23:42.044 187256 DEBUG nova.network.neutron [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Successfully created port: e625ba45-02b0-4430-8aff-d9674489e676 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:23:42 np0005538960 podman[217231]: 2025-11-28 16:23:42.154808353 +0000 UTC m=+0.062941308 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Nov 28 11:23:42 np0005538960 nova_compute[187252]: 2025-11-28 16:23:42.736 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:43 np0005538960 nova_compute[187252]: 2025-11-28 16:23:43.221 187256 DEBUG nova.network.neutron [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Successfully updated port: e625ba45-02b0-4430-8aff-d9674489e676 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:23:43 np0005538960 nova_compute[187252]: 2025-11-28 16:23:43.234 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:23:43 np0005538960 nova_compute[187252]: 2025-11-28 16:23:43.234 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquired lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:23:43 np0005538960 nova_compute[187252]: 2025-11-28 16:23:43.234 187256 DEBUG nova.network.neutron [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:23:43 np0005538960 nova_compute[187252]: 2025-11-28 16:23:43.465 187256 DEBUG nova.network.neutron [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:23:43 np0005538960 nova_compute[187252]: 2025-11-28 16:23:43.499 187256 DEBUG nova.compute.manager [req-f6093e24-6cc8-4ef9-b0f8-6a9177e0203f req-c68b58d1-5d5b-4da8-b8cb-aca924532add 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received event network-changed-e625ba45-02b0-4430-8aff-d9674489e676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:23:43 np0005538960 nova_compute[187252]: 2025-11-28 16:23:43.499 187256 DEBUG nova.compute.manager [req-f6093e24-6cc8-4ef9-b0f8-6a9177e0203f req-c68b58d1-5d5b-4da8-b8cb-aca924532add 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Refreshing instance network info cache due to event network-changed-e625ba45-02b0-4430-8aff-d9674489e676. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:23:43 np0005538960 nova_compute[187252]: 2025-11-28 16:23:43.499 187256 DEBUG oslo_concurrency.lockutils [req-f6093e24-6cc8-4ef9-b0f8-6a9177e0203f req-c68b58d1-5d5b-4da8-b8cb-aca924532add 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.589 187256 DEBUG oslo_concurrency.lockutils [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquiring lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.590 187256 DEBUG oslo_concurrency.lockutils [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.590 187256 DEBUG oslo_concurrency.lockutils [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquiring lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.591 187256 DEBUG oslo_concurrency.lockutils [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.591 187256 DEBUG oslo_concurrency.lockutils [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.592 187256 INFO nova.compute.manager [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Terminating instance#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.593 187256 DEBUG nova.compute.manager [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:23:44 np0005538960 kernel: tap6afaadc4-fd (unregistering): left promiscuous mode
Nov 28 11:23:44 np0005538960 NetworkManager[55548]: <info>  [1764347024.6217] device (tap6afaadc4-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.625 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:44 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:44Z|00086|binding|INFO|Releasing lport 6afaadc4-fd5f-49fa-80df-cf437202f1e3 from this chassis (sb_readonly=0)
Nov 28 11:23:44 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:44Z|00087|binding|INFO|Setting lport 6afaadc4-fd5f-49fa-80df-cf437202f1e3 down in Southbound
Nov 28 11:23:44 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:44Z|00088|binding|INFO|Removing iface tap6afaadc4-fd ovn-installed in OVS
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.628 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.639 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:a5:6f 10.100.0.5'], port_security=['fa:16:3e:85:a5:6f 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'a59a72c8-b3b5-407f-8a7a-f939a34b5c75', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b95cbfd446a3402a9845b8e54a0539b1', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3dc8fbc7-f735-4159-bfeb-4b9a5ad39eb4 c5cc017c-c3ae-4527-bcc3-ef0b586790a1', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2ceedba3-308b-43c0-901a-53307d08facf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=6afaadc4-fd5f-49fa-80df-cf437202f1e3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.642 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 6afaadc4-fd5f-49fa-80df-cf437202f1e3 in datapath a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d unbound from our chassis#033[00m
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.643 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.644 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[bfe9520f-f126-410d-bed5-20c4074d28f4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.645 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d namespace which is not needed anymore#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.651 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:44 np0005538960 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000010.scope: Deactivated successfully.
Nov 28 11:23:44 np0005538960 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000010.scope: Consumed 16.069s CPU time.
Nov 28 11:23:44 np0005538960 systemd-machined[153518]: Machine qemu-6-instance-00000010 terminated.
Nov 28 11:23:44 np0005538960 neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d[216886]: [NOTICE]   (216890) : haproxy version is 2.8.14-c23fe91
Nov 28 11:23:44 np0005538960 neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d[216886]: [NOTICE]   (216890) : path to executable is /usr/sbin/haproxy
Nov 28 11:23:44 np0005538960 neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d[216886]: [WARNING]  (216890) : Exiting Master process...
Nov 28 11:23:44 np0005538960 neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d[216886]: [ALERT]    (216890) : Current worker (216892) exited with code 143 (Terminated)
Nov 28 11:23:44 np0005538960 neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d[216886]: [WARNING]  (216890) : All workers exited. Exiting... (0)
Nov 28 11:23:44 np0005538960 systemd[1]: libpod-9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59.scope: Deactivated successfully.
Nov 28 11:23:44 np0005538960 podman[217278]: 2025-11-28 16:23:44.794119105 +0000 UTC m=+0.058410098 container died 9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.823 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.830 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:44 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59-userdata-shm.mount: Deactivated successfully.
Nov 28 11:23:44 np0005538960 systemd[1]: var-lib-containers-storage-overlay-9a0cce5f49af2bd13705321e62d1cf9a5dec24c69323a5765bff0e52228f4c80-merged.mount: Deactivated successfully.
Nov 28 11:23:44 np0005538960 podman[217278]: 2025-11-28 16:23:44.856061956 +0000 UTC m=+0.120352949 container cleanup 9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:23:44 np0005538960 systemd[1]: libpod-conmon-9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59.scope: Deactivated successfully.
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.870 187256 INFO nova.virt.libvirt.driver [-] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Instance destroyed successfully.#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.872 187256 DEBUG nova.objects.instance [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lazy-loading 'resources' on Instance uuid a59a72c8-b3b5-407f-8a7a-f939a34b5c75 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.887 187256 DEBUG nova.virt.libvirt.vif [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:22:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-890711526-access_point-863719353',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-890711526-access_point-863719353',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-890711526-acc',id=16,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL4T/R7E/vFcSeE0fzLvkiSoTLLYGisURb0A/0geduwhytTpHsgju71rV8vWYR0ERy2GKPWTieugKB6vZ3VOh75uVQKt+R6zhf4sxILFG2K8YDw/c++kFBMV6Ggxna9ifg==',key_name='tempest-TestSecurityGroupsBasicOps-1568398266',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:22:46Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b95cbfd446a3402a9845b8e54a0539b1',ramdisk_id='',reservation_id='r-pibpvlng',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-890711526',owner_user_name='tempest-TestSecurityGroupsBasicOps-890711526-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:22:46Z,user_data=None,user_id='070b2b4a7f634b70a38a8f51ce54dd63',uuid=a59a72c8-b3b5-407f-8a7a-f939a34b5c75,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.887 187256 DEBUG nova.network.os_vif_util [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Converting VIF {"id": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "address": "fa:16:3e:85:a5:6f", "network": {"id": "a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d", "bridge": "br-int", "label": "tempest-network-smoke--2035867680", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b95cbfd446a3402a9845b8e54a0539b1", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6afaadc4-fd", "ovs_interfaceid": "6afaadc4-fd5f-49fa-80df-cf437202f1e3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.888 187256 DEBUG nova.network.os_vif_util [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:85:a5:6f,bridge_name='br-int',has_traffic_filtering=True,id=6afaadc4-fd5f-49fa-80df-cf437202f1e3,network=Network(a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6afaadc4-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.889 187256 DEBUG os_vif [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:a5:6f,bridge_name='br-int',has_traffic_filtering=True,id=6afaadc4-fd5f-49fa-80df-cf437202f1e3,network=Network(a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6afaadc4-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.890 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.891 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6afaadc4-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.892 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.894 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.897 187256 INFO os_vif [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:a5:6f,bridge_name='br-int',has_traffic_filtering=True,id=6afaadc4-fd5f-49fa-80df-cf437202f1e3,network=Network(a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6afaadc4-fd')#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.898 187256 INFO nova.virt.libvirt.driver [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Deleting instance files /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75_del#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.899 187256 INFO nova.virt.libvirt.driver [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Deletion of /var/lib/nova/instances/a59a72c8-b3b5-407f-8a7a-f939a34b5c75_del complete#033[00m
Nov 28 11:23:44 np0005538960 podman[217322]: 2025-11-28 16:23:44.931113819 +0000 UTC m=+0.049192322 container remove 9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.940 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[7565f6d6-0380-463a-a4b4-5ddca24b8f1f]: (4, ('Fri Nov 28 04:23:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d (9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59)\n9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59\nFri Nov 28 04:23:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d (9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59)\n9db69fdce0adccf66e46794b36990761d09a2f160c30ea024c56fc3e3b5b2b59\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.942 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3f7410c7-9810-4fce-aa6a-451b375da746]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.943 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4ee5c01-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.945 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:44 np0005538960 kernel: tapa4ee5c01-b0: left promiscuous mode
Nov 28 11:23:44 np0005538960 nova_compute[187252]: 2025-11-28 16:23:44.969 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.973 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[af2e0484-ff51-44b8-acf4-dc4954977dc3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.992 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1914fbe5-2361-4764-9a1a-5c1192e6f4d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:44 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:44.993 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6a95fead-ce17-4dce-ae5b-e4817fd9a4a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:45 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:45.009 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d15cc3ce-e84e-4574-8ec3-6b1ec918087a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406214, 'reachable_time': 22452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217336, 'error': None, 'target': 'ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:45 np0005538960 systemd[1]: run-netns-ovnmeta\x2da4ee5c01\x2db753\x2d4543\x2d8f0b\x2dc1f2f8f8b63d.mount: Deactivated successfully.
Nov 28 11:23:45 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:45.012 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a4ee5c01-b753-4543-8f0b-c1f2f8f8b63d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:23:45 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:45.013 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[62651ef5-9c16-4463-b5eb-83abc9ea2b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.013 187256 INFO nova.compute.manager [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Took 0.42 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.014 187256 DEBUG oslo.service.loopingcall [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.014 187256 DEBUG nova.compute.manager [-] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.015 187256 DEBUG nova.network.neutron [-] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.540 187256 DEBUG nova.network.neutron [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Updating instance_info_cache with network_info: [{"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.750 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Releasing lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.751 187256 DEBUG nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Instance network_info: |[{"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.751 187256 DEBUG oslo_concurrency.lockutils [req-f6093e24-6cc8-4ef9-b0f8-6a9177e0203f req-c68b58d1-5d5b-4da8-b8cb-aca924532add 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.752 187256 DEBUG nova.network.neutron [req-f6093e24-6cc8-4ef9-b0f8-6a9177e0203f req-c68b58d1-5d5b-4da8-b8cb-aca924532add 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Refreshing network info cache for port e625ba45-02b0-4430-8aff-d9674489e676 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.756 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Start _get_guest_xml network_info=[{"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.761 187256 WARNING nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.767 187256 DEBUG nova.virt.libvirt.host [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.768 187256 DEBUG nova.virt.libvirt.host [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.773 187256 DEBUG nova.virt.libvirt.host [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.773 187256 DEBUG nova.virt.libvirt.host [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.775 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.775 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.775 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.776 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.776 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.776 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.776 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.776 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.777 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.777 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.777 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.777 187256 DEBUG nova.virt.hardware [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.781 187256 DEBUG nova.virt.libvirt.vif [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:23:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1892971024',display_name='tempest-TestNetworkBasicOps-server-1892971024',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1892971024',id=19,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG/7odxlfRfrLDu3Z3S+r1CDscLCljTk3aaVGgDh0PB+Og4NIJJkQsoCw6yTulRPk+DuAcq538jfu8LteEqYkv00icBqxxc1fJLJW3MCC/5DqlVFaqPgNv7nxJX8KuZdQg==',key_name='tempest-TestNetworkBasicOps-709816810',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-dlzv8dgb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:23:38Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=cd030167-3f92-4d3e-9ecc-3be8d39ada4f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.781 187256 DEBUG nova.network.os_vif_util [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.782 187256 DEBUG nova.network.os_vif_util [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:d5:83,bridge_name='br-int',has_traffic_filtering=True,id=e625ba45-02b0-4430-8aff-d9674489e676,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape625ba45-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.782 187256 DEBUG nova.objects.instance [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'pci_devices' on Instance uuid cd030167-3f92-4d3e-9ecc-3be8d39ada4f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.794 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <uuid>cd030167-3f92-4d3e-9ecc-3be8d39ada4f</uuid>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <name>instance-00000013</name>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkBasicOps-server-1892971024</nova:name>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:23:45</nova:creationTime>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:        <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:        <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:        <nova:port uuid="e625ba45-02b0-4430-8aff-d9674489e676">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <entry name="serial">cd030167-3f92-4d3e-9ecc-3be8d39ada4f</entry>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <entry name="uuid">cd030167-3f92-4d3e-9ecc-3be8d39ada4f</entry>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.config"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:0e:d5:83"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <target dev="tape625ba45-02"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/console.log" append="off"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:23:45 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:23:45 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:23:45 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:23:45 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.796 187256 DEBUG nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Preparing to wait for external event network-vif-plugged-e625ba45-02b0-4430-8aff-d9674489e676 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.796 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.797 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.797 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.798 187256 DEBUG nova.virt.libvirt.vif [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:23:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1892971024',display_name='tempest-TestNetworkBasicOps-server-1892971024',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1892971024',id=19,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG/7odxlfRfrLDu3Z3S+r1CDscLCljTk3aaVGgDh0PB+Og4NIJJkQsoCw6yTulRPk+DuAcq538jfu8LteEqYkv00icBqxxc1fJLJW3MCC/5DqlVFaqPgNv7nxJX8KuZdQg==',key_name='tempest-TestNetworkBasicOps-709816810',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-dlzv8dgb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:23:38Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=cd030167-3f92-4d3e-9ecc-3be8d39ada4f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.798 187256 DEBUG nova.network.os_vif_util [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.799 187256 DEBUG nova.network.os_vif_util [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:d5:83,bridge_name='br-int',has_traffic_filtering=True,id=e625ba45-02b0-4430-8aff-d9674489e676,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape625ba45-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.800 187256 DEBUG os_vif [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:d5:83,bridge_name='br-int',has_traffic_filtering=True,id=e625ba45-02b0-4430-8aff-d9674489e676,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape625ba45-02') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.800 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.801 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.801 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.803 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.804 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape625ba45-02, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.804 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape625ba45-02, col_values=(('external_ids', {'iface-id': 'e625ba45-02b0-4430-8aff-d9674489e676', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:d5:83', 'vm-uuid': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.806 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:45 np0005538960 NetworkManager[55548]: <info>  [1764347025.8081] manager: (tape625ba45-02): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.808 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.813 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.814 187256 INFO os_vif [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:d5:83,bridge_name='br-int',has_traffic_filtering=True,id=e625ba45-02b0-4430-8aff-d9674489e676,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape625ba45-02')#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.867 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.867 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.868 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No VIF found with MAC fa:16:3e:0e:d5:83, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:23:45 np0005538960 nova_compute[187252]: 2025-11-28 16:23:45.869 187256 INFO nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Using config drive#033[00m
Nov 28 11:23:46 np0005538960 nova_compute[187252]: 2025-11-28 16:23:46.736 187256 INFO nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Creating config drive at /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.config#033[00m
Nov 28 11:23:46 np0005538960 nova_compute[187252]: 2025-11-28 16:23:46.741 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpup92x12q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:23:46 np0005538960 nova_compute[187252]: 2025-11-28 16:23:46.868 187256 DEBUG oslo_concurrency.processutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpup92x12q" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:23:46 np0005538960 kernel: tape625ba45-02: entered promiscuous mode
Nov 28 11:23:46 np0005538960 NetworkManager[55548]: <info>  [1764347026.9509] manager: (tape625ba45-02): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Nov 28 11:23:46 np0005538960 systemd-udevd[217258]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:23:46 np0005538960 nova_compute[187252]: 2025-11-28 16:23:46.950 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:46Z|00089|binding|INFO|Claiming lport e625ba45-02b0-4430-8aff-d9674489e676 for this chassis.
Nov 28 11:23:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:46Z|00090|binding|INFO|e625ba45-02b0-4430-8aff-d9674489e676: Claiming fa:16:3e:0e:d5:83 10.100.0.6
Nov 28 11:23:46 np0005538960 NetworkManager[55548]: <info>  [1764347026.9638] device (tape625ba45-02): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:23:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:46Z|00091|binding|INFO|Setting lport e625ba45-02b0-4430-8aff-d9674489e676 ovn-installed in OVS
Nov 28 11:23:46 np0005538960 NetworkManager[55548]: <info>  [1764347026.9653] device (tape625ba45-02): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:23:46 np0005538960 nova_compute[187252]: 2025-11-28 16:23:46.966 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:46Z|00092|binding|INFO|Setting lport e625ba45-02b0-4430-8aff-d9674489e676 up in Southbound
Nov 28 11:23:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:46.975 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:d5:83 10.100.0.6'], port_security=['fa:16:3e:0e:d5:83 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57ac1780-c5b4-496c-8b5e-7335798054b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd2079cea-1e9f-41b5-816a-6c797998eaed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4d55fe3-0802-461e-86b4-090e5688fd31, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=e625ba45-02b0-4430-8aff-d9674489e676) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:23:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:46.977 104369 INFO neutron.agent.ovn.metadata.agent [-] Port e625ba45-02b0-4430-8aff-d9674489e676 in datapath 57ac1780-c5b4-496c-8b5e-7335798054b3 bound to our chassis#033[00m
Nov 28 11:23:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:46.978 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 57ac1780-c5b4-496c-8b5e-7335798054b3#033[00m
Nov 28 11:23:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:46.994 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[358f8b59-3b1e-4b7b-abbf-d3c7fe7fa764]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:46.995 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap57ac1780-c1 in ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:23:46 np0005538960 systemd-machined[153518]: New machine qemu-7-instance-00000013.
Nov 28 11:23:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:46.998 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap57ac1780-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:46.998 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[55ef4bd5-a409-47fa-abe1-2f0643b026be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.000 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ab27c651-c5d7-47fb-8a0e-c0a9db16c673]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 systemd[1]: Started Virtual Machine qemu-7-instance-00000013.
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.014 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[2348302f-c2d2-40fe-acdd-13174ac8602b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.027 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[13b8e106-e519-4b25-88f3-5fda3f604b88]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.060 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[ebda9bbb-6b39-43bc-b7c8-c5bbb17f5e45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.067 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a5650a42-9e18-4be3-83ec-245200bc4589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 NetworkManager[55548]: <info>  [1764347027.0682] manager: (tap57ac1780-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.096 187256 DEBUG nova.network.neutron [-] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.095 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[17ec8dc3-0baf-4102-9872-c3751e08d079]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.100 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[4415c763-26f4-4b0d-be6e-ca9eec21940b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 NetworkManager[55548]: <info>  [1764347027.1166] device (tap57ac1780-c0): carrier: link connected
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.120 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[32a704c2-eb70-4fc0-8dd4-4536c3f6a9de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.138 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1eac0610-b53a-4ebe-9550-15bee1bb0f20]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57ac1780-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:05:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412669, 'reachable_time': 19610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217388, 'error': None, 'target': 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.155 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[50003a71-35bc-46c1-a944-8ddc02c0f320]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb5:5eb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412669, 'tstamp': 412669}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217389, 'error': None, 'target': 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.175 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f19798ca-def3-4727-982d-2fd3168154bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57ac1780-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:05:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412669, 'reachable_time': 19610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217390, 'error': None, 'target': 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.211 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[af1801f3-44ef-418a-a469-31239f445cd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.292 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9e33c4cd-8df1-4145-8128-b958b7c5c08f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.294 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57ac1780-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.294 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.294 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57ac1780-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.333 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:47 np0005538960 NetworkManager[55548]: <info>  [1764347027.3345] manager: (tap57ac1780-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Nov 28 11:23:47 np0005538960 kernel: tap57ac1780-c0: entered promiscuous mode
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.337 187256 INFO nova.compute.manager [-] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Took 2.32 seconds to deallocate network for instance.#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.337 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.346 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap57ac1780-c0, col_values=(('external_ids', {'iface-id': '59b9a1a2-0895-427c-9467-c27cddf3180e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.347 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:47 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:47Z|00093|binding|INFO|Releasing lport 59b9a1a2-0895-427c-9467-c27cddf3180e from this chassis (sb_readonly=0)
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.349 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.352 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/57ac1780-c5b4-496c-8b5e-7335798054b3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/57ac1780-c5b4-496c-8b5e-7335798054b3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.353 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[75637c81-da3f-45d4-8a90-cf3747333378]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.354 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-57ac1780-c5b4-496c-8b5e-7335798054b3
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/57ac1780-c5b4-496c-8b5e-7335798054b3.pid.haproxy
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID 57ac1780-c5b4-496c-8b5e-7335798054b3
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:23:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:23:47.355 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'env', 'PROCESS_TAG=haproxy-57ac1780-c5b4-496c-8b5e-7335798054b3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/57ac1780-c5b4-496c-8b5e-7335798054b3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.372 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.506 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347027.5050967, cd030167-3f92-4d3e-9ecc-3be8d39ada4f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.507 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] VM Started (Lifecycle Event)#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.726 187256 DEBUG nova.compute.manager [req-7ff37ec9-8fc3-450e-b216-7b2f4e7ac8cb req-93b1d12f-e0b9-452d-979f-18dd31b94716 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Received event network-changed-6afaadc4-fd5f-49fa-80df-cf437202f1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.728 187256 DEBUG nova.compute.manager [req-7ff37ec9-8fc3-450e-b216-7b2f4e7ac8cb req-93b1d12f-e0b9-452d-979f-18dd31b94716 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Refreshing instance network info cache due to event network-changed-6afaadc4-fd5f-49fa-80df-cf437202f1e3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.728 187256 DEBUG oslo_concurrency.lockutils [req-7ff37ec9-8fc3-450e-b216-7b2f4e7ac8cb req-93b1d12f-e0b9-452d-979f-18dd31b94716 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.729 187256 DEBUG oslo_concurrency.lockutils [req-7ff37ec9-8fc3-450e-b216-7b2f4e7ac8cb req-93b1d12f-e0b9-452d-979f-18dd31b94716 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.729 187256 DEBUG nova.network.neutron [req-7ff37ec9-8fc3-450e-b216-7b2f4e7ac8cb req-93b1d12f-e0b9-452d-979f-18dd31b94716 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Refreshing network info cache for port 6afaadc4-fd5f-49fa-80df-cf437202f1e3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.737 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.747 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.758 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347027.506775, cd030167-3f92-4d3e-9ecc-3be8d39ada4f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.758 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.762 187256 DEBUG oslo_concurrency.lockutils [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.762 187256 DEBUG oslo_concurrency.lockutils [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:47 np0005538960 podman[217429]: 2025-11-28 16:23:47.789308425 +0000 UTC m=+0.077907023 container create ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.792 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.796 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.825 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:23:47 np0005538960 podman[217429]: 2025-11-28 16:23:47.741307833 +0000 UTC m=+0.029906441 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.871 187256 DEBUG nova.compute.provider_tree [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:23:47 np0005538960 systemd[1]: Started libpod-conmon-ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276.scope.
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.897 187256 DEBUG nova.scheduler.client.report [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:23:47 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:23:47 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ea4f1003ea2ce1e09abbc02b09010d4aacd02f5aecd4ca1c88dc24de76f392df/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.929 187256 DEBUG oslo_concurrency.lockutils [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:47 np0005538960 podman[217429]: 2025-11-28 16:23:47.936437176 +0000 UTC m=+0.225035794 container init ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:23:47 np0005538960 podman[217429]: 2025-11-28 16:23:47.943238822 +0000 UTC m=+0.231837410 container start ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.951 187256 INFO nova.scheduler.client.report [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Deleted allocations for instance a59a72c8-b3b5-407f-8a7a-f939a34b5c75#033[00m
Nov 28 11:23:47 np0005538960 nova_compute[187252]: 2025-11-28 16:23:47.965 187256 DEBUG nova.network.neutron [req-7ff37ec9-8fc3-450e-b216-7b2f4e7ac8cb req-93b1d12f-e0b9-452d-979f-18dd31b94716 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:23:47 np0005538960 neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3[217444]: [NOTICE]   (217448) : New worker (217450) forked
Nov 28 11:23:47 np0005538960 neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3[217444]: [NOTICE]   (217448) : Loading success.
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.007 187256 DEBUG nova.compute.manager [req-6c985fc3-fa81-46c3-af4d-436ed968dfa6 req-d3b9f6ff-675a-4a2a-abb0-09cfe7cdb2c2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received event network-vif-plugged-e625ba45-02b0-4430-8aff-d9674489e676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.007 187256 DEBUG oslo_concurrency.lockutils [req-6c985fc3-fa81-46c3-af4d-436ed968dfa6 req-d3b9f6ff-675a-4a2a-abb0-09cfe7cdb2c2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.008 187256 DEBUG oslo_concurrency.lockutils [req-6c985fc3-fa81-46c3-af4d-436ed968dfa6 req-d3b9f6ff-675a-4a2a-abb0-09cfe7cdb2c2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.008 187256 DEBUG oslo_concurrency.lockutils [req-6c985fc3-fa81-46c3-af4d-436ed968dfa6 req-d3b9f6ff-675a-4a2a-abb0-09cfe7cdb2c2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.008 187256 DEBUG nova.compute.manager [req-6c985fc3-fa81-46c3-af4d-436ed968dfa6 req-d3b9f6ff-675a-4a2a-abb0-09cfe7cdb2c2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Processing event network-vif-plugged-e625ba45-02b0-4430-8aff-d9674489e676 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.009 187256 DEBUG nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.015 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347028.0151665, cd030167-3f92-4d3e-9ecc-3be8d39ada4f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.015 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.018 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.021 187256 INFO nova.virt.libvirt.driver [-] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Instance spawned successfully.#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.021 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.037 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.046 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.050 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.051 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.051 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.052 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.052 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.053 187256 DEBUG nova.virt.libvirt.driver [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.058 187256 DEBUG oslo_concurrency.lockutils [None req-42e5d023-953e-4d0f-95f6-8ede8903dd88 070b2b4a7f634b70a38a8f51ce54dd63 b95cbfd446a3402a9845b8e54a0539b1 - - default default] Lock "a59a72c8-b3b5-407f-8a7a-f939a34b5c75" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.468s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.149 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.227 187256 INFO nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Took 9.91 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.227 187256 DEBUG nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.308 187256 INFO nova.compute.manager [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Took 10.38 seconds to build instance.#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.335 187256 DEBUG oslo_concurrency.lockutils [None req-a0e46b4b-0fae-4695-94c8-65c4c16b9653 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.475s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.561 187256 DEBUG nova.network.neutron [req-f6093e24-6cc8-4ef9-b0f8-6a9177e0203f req-c68b58d1-5d5b-4da8-b8cb-aca924532add 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Updated VIF entry in instance network info cache for port e625ba45-02b0-4430-8aff-d9674489e676. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.562 187256 DEBUG nova.network.neutron [req-f6093e24-6cc8-4ef9-b0f8-6a9177e0203f req-c68b58d1-5d5b-4da8-b8cb-aca924532add 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Updating instance_info_cache with network_info: [{"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.584 187256 DEBUG oslo_concurrency.lockutils [req-f6093e24-6cc8-4ef9-b0f8-6a9177e0203f req-c68b58d1-5d5b-4da8-b8cb-aca924532add 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.663 187256 DEBUG nova.network.neutron [req-7ff37ec9-8fc3-450e-b216-7b2f4e7ac8cb req-93b1d12f-e0b9-452d-979f-18dd31b94716 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:23:48 np0005538960 nova_compute[187252]: 2025-11-28 16:23:48.696 187256 DEBUG oslo_concurrency.lockutils [req-7ff37ec9-8fc3-450e-b216-7b2f4e7ac8cb req-93b1d12f-e0b9-452d-979f-18dd31b94716 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-a59a72c8-b3b5-407f-8a7a-f939a34b5c75" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:23:50 np0005538960 nova_compute[187252]: 2025-11-28 16:23:50.219 187256 DEBUG nova.compute.manager [req-876603cb-6743-4013-93c7-8d4e98103a47 req-b94b3327-1c5a-4d8d-b4c5-f49b8996194c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Received event network-vif-deleted-6afaadc4-fd5f-49fa-80df-cf437202f1e3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:23:50 np0005538960 nova_compute[187252]: 2025-11-28 16:23:50.318 187256 DEBUG nova.compute.manager [req-1528e48f-11ee-45e8-8fae-dc15a7eef8be req-0d8476c8-c8e8-4cfb-b620-4c27a425de5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received event network-vif-plugged-e625ba45-02b0-4430-8aff-d9674489e676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:23:50 np0005538960 nova_compute[187252]: 2025-11-28 16:23:50.319 187256 DEBUG oslo_concurrency.lockutils [req-1528e48f-11ee-45e8-8fae-dc15a7eef8be req-0d8476c8-c8e8-4cfb-b620-4c27a425de5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:23:50 np0005538960 nova_compute[187252]: 2025-11-28 16:23:50.320 187256 DEBUG oslo_concurrency.lockutils [req-1528e48f-11ee-45e8-8fae-dc15a7eef8be req-0d8476c8-c8e8-4cfb-b620-4c27a425de5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:23:50 np0005538960 nova_compute[187252]: 2025-11-28 16:23:50.321 187256 DEBUG oslo_concurrency.lockutils [req-1528e48f-11ee-45e8-8fae-dc15a7eef8be req-0d8476c8-c8e8-4cfb-b620-4c27a425de5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:23:50 np0005538960 nova_compute[187252]: 2025-11-28 16:23:50.321 187256 DEBUG nova.compute.manager [req-1528e48f-11ee-45e8-8fae-dc15a7eef8be req-0d8476c8-c8e8-4cfb-b620-4c27a425de5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] No waiting events found dispatching network-vif-plugged-e625ba45-02b0-4430-8aff-d9674489e676 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:23:50 np0005538960 nova_compute[187252]: 2025-11-28 16:23:50.322 187256 WARNING nova.compute.manager [req-1528e48f-11ee-45e8-8fae-dc15a7eef8be req-0d8476c8-c8e8-4cfb-b620-4c27a425de5b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received unexpected event network-vif-plugged-e625ba45-02b0-4430-8aff-d9674489e676 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:23:50 np0005538960 nova_compute[187252]: 2025-11-28 16:23:50.807 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:51 np0005538960 podman[217459]: 2025-11-28 16:23:51.166268044 +0000 UTC m=+0.071546177 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 11:23:52 np0005538960 nova_compute[187252]: 2025-11-28 16:23:52.739 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:52 np0005538960 nova_compute[187252]: 2025-11-28 16:23:52.980 187256 DEBUG nova.compute.manager [req-093d2e21-4952-4763-8cf4-0f179a05ec64 req-ae0c59db-271d-4375-b539-bdf320a01354 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received event network-changed-e625ba45-02b0-4430-8aff-d9674489e676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:23:52 np0005538960 nova_compute[187252]: 2025-11-28 16:23:52.981 187256 DEBUG nova.compute.manager [req-093d2e21-4952-4763-8cf4-0f179a05ec64 req-ae0c59db-271d-4375-b539-bdf320a01354 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Refreshing instance network info cache due to event network-changed-e625ba45-02b0-4430-8aff-d9674489e676. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:23:52 np0005538960 nova_compute[187252]: 2025-11-28 16:23:52.982 187256 DEBUG oslo_concurrency.lockutils [req-093d2e21-4952-4763-8cf4-0f179a05ec64 req-ae0c59db-271d-4375-b539-bdf320a01354 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:23:52 np0005538960 nova_compute[187252]: 2025-11-28 16:23:52.982 187256 DEBUG oslo_concurrency.lockutils [req-093d2e21-4952-4763-8cf4-0f179a05ec64 req-ae0c59db-271d-4375-b539-bdf320a01354 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:23:52 np0005538960 nova_compute[187252]: 2025-11-28 16:23:52.982 187256 DEBUG nova.network.neutron [req-093d2e21-4952-4763-8cf4-0f179a05ec64 req-ae0c59db-271d-4375-b539-bdf320a01354 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Refreshing network info cache for port e625ba45-02b0-4430-8aff-d9674489e676 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:23:53 np0005538960 ovn_controller[95460]: 2025-11-28T16:23:53Z|00094|binding|INFO|Releasing lport 59b9a1a2-0895-427c-9467-c27cddf3180e from this chassis (sb_readonly=0)
Nov 28 11:23:53 np0005538960 nova_compute[187252]: 2025-11-28 16:23:53.195 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:54 np0005538960 podman[217479]: 2025-11-28 16:23:54.164472178 +0000 UTC m=+0.062041367 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:23:55 np0005538960 nova_compute[187252]: 2025-11-28 16:23:55.810 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:56 np0005538960 nova_compute[187252]: 2025-11-28 16:23:56.227 187256 DEBUG nova.network.neutron [req-093d2e21-4952-4763-8cf4-0f179a05ec64 req-ae0c59db-271d-4375-b539-bdf320a01354 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Updated VIF entry in instance network info cache for port e625ba45-02b0-4430-8aff-d9674489e676. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:23:56 np0005538960 nova_compute[187252]: 2025-11-28 16:23:56.228 187256 DEBUG nova.network.neutron [req-093d2e21-4952-4763-8cf4-0f179a05ec64 req-ae0c59db-271d-4375-b539-bdf320a01354 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Updating instance_info_cache with network_info: [{"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:23:56 np0005538960 nova_compute[187252]: 2025-11-28 16:23:56.249 187256 DEBUG oslo_concurrency.lockutils [req-093d2e21-4952-4763-8cf4-0f179a05ec64 req-ae0c59db-271d-4375-b539-bdf320a01354 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:23:57 np0005538960 nova_compute[187252]: 2025-11-28 16:23:57.742 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:23:59 np0005538960 nova_compute[187252]: 2025-11-28 16:23:59.870 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347024.8684871, a59a72c8-b3b5-407f-8a7a-f939a34b5c75 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:23:59 np0005538960 nova_compute[187252]: 2025-11-28 16:23:59.872 187256 INFO nova.compute.manager [-] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:23:59 np0005538960 nova_compute[187252]: 2025-11-28 16:23:59.973 187256 DEBUG nova.compute.manager [None req-9243db2e-07dd-4092-86b3-3c70e47e0d4e - - - - - -] [instance: a59a72c8-b3b5-407f-8a7a-f939a34b5c75] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:24:00 np0005538960 nova_compute[187252]: 2025-11-28 16:24:00.128 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:00 np0005538960 nova_compute[187252]: 2025-11-28 16:24:00.813 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:01 np0005538960 podman[217509]: 2025-11-28 16:24:01.195728847 +0000 UTC m=+0.092934550 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 11:24:02 np0005538960 nova_compute[187252]: 2025-11-28 16:24:02.744 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:04 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:04Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0e:d5:83 10.100.0.6
Nov 28 11:24:04 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:04Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0e:d5:83 10.100.0.6
Nov 28 11:24:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:05.052 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:24:05 np0005538960 nova_compute[187252]: 2025-11-28 16:24:05.053 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:05.054 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:24:05 np0005538960 podman[217552]: 2025-11-28 16:24:05.156881298 +0000 UTC m=+0.051771085 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:24:05 np0005538960 podman[217551]: 2025-11-28 16:24:05.158388826 +0000 UTC m=+0.058368476 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 11:24:05 np0005538960 nova_compute[187252]: 2025-11-28 16:24:05.816 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:06.344 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:24:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:06.345 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:24:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:06.345 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:24:07 np0005538960 nova_compute[187252]: 2025-11-28 16:24:07.747 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:10 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:10.056 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:24:10 np0005538960 podman[217587]: 2025-11-28 16:24:10.159198978 +0000 UTC m=+0.058530750 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:24:10 np0005538960 nova_compute[187252]: 2025-11-28 16:24:10.854 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:11 np0005538960 nova_compute[187252]: 2025-11-28 16:24:11.898 187256 INFO nova.compute.manager [None req-994c49a7-aa22-41df-be8c-70f5e9fe9644 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Get console output#033[00m
Nov 28 11:24:11 np0005538960 nova_compute[187252]: 2025-11-28 16:24:11.905 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:24:12 np0005538960 nova_compute[187252]: 2025-11-28 16:24:12.751 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:13 np0005538960 podman[217610]: 2025-11-28 16:24:13.155655349 +0000 UTC m=+0.059300219 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, architecture=x86_64, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:24:13 np0005538960 nova_compute[187252]: 2025-11-28 16:24:13.579 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:15 np0005538960 nova_compute[187252]: 2025-11-28 16:24:15.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:24:15 np0005538960 nova_compute[187252]: 2025-11-28 16:24:15.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:24:15 np0005538960 nova_compute[187252]: 2025-11-28 16:24:15.857 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:16 np0005538960 nova_compute[187252]: 2025-11-28 16:24:16.734 187256 DEBUG nova.compute.manager [req-b16d22aa-fb9c-49aa-81d2-93327c86f90d req-95b10258-f353-4a44-92d7-b5d4526f860f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received event network-changed-e625ba45-02b0-4430-8aff-d9674489e676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:24:16 np0005538960 nova_compute[187252]: 2025-11-28 16:24:16.734 187256 DEBUG nova.compute.manager [req-b16d22aa-fb9c-49aa-81d2-93327c86f90d req-95b10258-f353-4a44-92d7-b5d4526f860f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Refreshing instance network info cache due to event network-changed-e625ba45-02b0-4430-8aff-d9674489e676. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:24:16 np0005538960 nova_compute[187252]: 2025-11-28 16:24:16.734 187256 DEBUG oslo_concurrency.lockutils [req-b16d22aa-fb9c-49aa-81d2-93327c86f90d req-95b10258-f353-4a44-92d7-b5d4526f860f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:24:16 np0005538960 nova_compute[187252]: 2025-11-28 16:24:16.735 187256 DEBUG oslo_concurrency.lockutils [req-b16d22aa-fb9c-49aa-81d2-93327c86f90d req-95b10258-f353-4a44-92d7-b5d4526f860f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:24:16 np0005538960 nova_compute[187252]: 2025-11-28 16:24:16.735 187256 DEBUG nova.network.neutron [req-b16d22aa-fb9c-49aa-81d2-93327c86f90d req-95b10258-f353-4a44-92d7-b5d4526f860f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Refreshing network info cache for port e625ba45-02b0-4430-8aff-d9674489e676 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.344 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.345 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.345 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.345 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.414 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.487 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.488 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.552 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.735 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.738 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5577MB free_disk=73.31381225585938GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.738 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.738 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.752 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.912 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance cd030167-3f92-4d3e-9ecc-3be8d39ada4f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.913 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.913 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.940 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing inventories for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.970 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating ProviderTree inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.971 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:24:17 np0005538960 nova_compute[187252]: 2025-11-28 16:24:17.990 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing aggregate associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 11:24:18 np0005538960 nova_compute[187252]: 2025-11-28 16:24:18.017 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing trait associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, traits: COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 11:24:18 np0005538960 nova_compute[187252]: 2025-11-28 16:24:18.072 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:24:18 np0005538960 nova_compute[187252]: 2025-11-28 16:24:18.099 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:24:18 np0005538960 nova_compute[187252]: 2025-11-28 16:24:18.157 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:24:18 np0005538960 nova_compute[187252]: 2025-11-28 16:24:18.158 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.420s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.159 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.160 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.677 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.713 187256 DEBUG nova.network.neutron [req-b16d22aa-fb9c-49aa-81d2-93327c86f90d req-95b10258-f353-4a44-92d7-b5d4526f860f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Updated VIF entry in instance network info cache for port e625ba45-02b0-4430-8aff-d9674489e676. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.714 187256 DEBUG nova.network.neutron [req-b16d22aa-fb9c-49aa-81d2-93327c86f90d req-95b10258-f353-4a44-92d7-b5d4526f860f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Updating instance_info_cache with network_info: [{"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.763 187256 DEBUG oslo_concurrency.lockutils [req-b16d22aa-fb9c-49aa-81d2-93327c86f90d req-95b10258-f353-4a44-92d7-b5d4526f860f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.764 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.765 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:24:19 np0005538960 nova_compute[187252]: 2025-11-28 16:24:19.765 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cd030167-3f92-4d3e-9ecc-3be8d39ada4f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:24:20 np0005538960 nova_compute[187252]: 2025-11-28 16:24:20.860 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:22 np0005538960 podman[217640]: 2025-11-28 16:24:22.156079672 +0000 UTC m=+0.061461791 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 11:24:22 np0005538960 nova_compute[187252]: 2025-11-28 16:24:22.755 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:25 np0005538960 podman[217660]: 2025-11-28 16:24:25.187197999 +0000 UTC m=+0.088052331 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:24:25 np0005538960 nova_compute[187252]: 2025-11-28 16:24:25.863 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:27 np0005538960 nova_compute[187252]: 2025-11-28 16:24:27.756 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:28 np0005538960 nova_compute[187252]: 2025-11-28 16:24:28.230 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Updating instance_info_cache with network_info: [{"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:24:28 np0005538960 nova_compute[187252]: 2025-11-28 16:24:28.258 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-cd030167-3f92-4d3e-9ecc-3be8d39ada4f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:24:28 np0005538960 nova_compute[187252]: 2025-11-28 16:24:28.258 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 11:24:28 np0005538960 nova_compute[187252]: 2025-11-28 16:24:28.259 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:24:28 np0005538960 nova_compute[187252]: 2025-11-28 16:24:28.259 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:24:28 np0005538960 nova_compute[187252]: 2025-11-28 16:24:28.259 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:24:29 np0005538960 nova_compute[187252]: 2025-11-28 16:24:29.254 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:24:30 np0005538960 nova_compute[187252]: 2025-11-28 16:24:30.866 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:32 np0005538960 podman[217686]: 2025-11-28 16:24:32.196985915 +0000 UTC m=+0.103135119 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 11:24:32 np0005538960 nova_compute[187252]: 2025-11-28 16:24:32.758 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:34 np0005538960 nova_compute[187252]: 2025-11-28 16:24:34.813 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "8dd47dea-36db-454f-9c3e-db6c599d52c0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:24:34 np0005538960 nova_compute[187252]: 2025-11-28 16:24:34.813 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:24:34 np0005538960 nova_compute[187252]: 2025-11-28 16:24:34.832 187256 DEBUG nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:24:34 np0005538960 nova_compute[187252]: 2025-11-28 16:24:34.914 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:24:34 np0005538960 nova_compute[187252]: 2025-11-28 16:24:34.915 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:24:34 np0005538960 nova_compute[187252]: 2025-11-28 16:24:34.922 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 11:24:34 np0005538960 nova_compute[187252]: 2025-11-28 16:24:34.922 187256 INFO nova.compute.claims [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Claim successful on node compute-1.ctlplane.example.com
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.076 187256 DEBUG nova.compute.provider_tree [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.092 187256 DEBUG nova.scheduler.client.report [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.115 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.201s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.117 187256 DEBUG nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.164 187256 DEBUG nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.165 187256 DEBUG nova.network.neutron [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.182 187256 INFO nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.199 187256 DEBUG nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.289 187256 DEBUG nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.291 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.292 187256 INFO nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Creating image(s)
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.293 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "/var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.294 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "/var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.295 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "/var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.311 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'name': 'tempest-TestNetworkBasicOps-server-1892971024', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000013', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'user_id': 'a4105532118847f583e4bf7594336693', 'hostId': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.312 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.312 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.312 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1892971024>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1892971024>]
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.313 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.314 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.317 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for cd030167-3f92-4d3e-9ecc-3be8d39ada4f / tape625ba45-02 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.317 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/network.outgoing.bytes volume: 16084 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a63ab48-aa7f-4b62-a43c-90cde978ba2b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 16084, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000013-cd030167-3f92-4d3e-9ecc-3be8d39ada4f-tape625ba45-02', 'timestamp': '2025-11-28T16:24:35.313415', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'tape625ba45-02', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:d5:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape625ba45-02'}, 'message_id': 'ba5a5c68-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.947569146, 'message_signature': 'd14476863633e60bce0b604d366811c7b66b67d4a8293cb6151b308c097c3cef'}]}, 'timestamp': '2025-11-28 16:24:35.317927', '_unique_id': '893ff8a3d8734761aaf3e3a56eaa6c64'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.319 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.321 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.321 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1892971024>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1892971024>]
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.350 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.write.bytes volume: 72945664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.351 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '766469fa-4fc0-4a43-9d96-5182db99a573', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72945664, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-vda', 'timestamp': '2025-11-28T16:24:35.321637', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba5f7220-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': '071c4ef22c8599dc57f5200adc1cee64b713ebf3cd672b86f58cbbcc115ee961'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-sda', 'timestamp': '2025-11-28T16:24:35.321637', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba5f87a6-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': '886c19da65f74303861c1129014b65654b144b270c495b49e164e41a258a0cfd'}]}, 'timestamp': '2025-11-28 16:24:35.351706', '_unique_id': 'fc24d061c0e04fae8e348677ed31723f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.353 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.354 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.374 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/memory.usage volume: 42.671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'afb36778-c829-4173-85e2-955473433155', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.671875, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'timestamp': '2025-11-28T16:24:35.354298', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ba630ade-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4175.008066682, 'message_signature': 'b67caddd9ba1bbb18928189224ae53ab343aa65fff193ae9fde51baa59ea3387'}]}, 'timestamp': '2025-11-28 16:24:35.374839', '_unique_id': '682e72a1bc93410b9b0d4b4aa8a68439'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.376 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.377 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.382 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.383 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.384 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.390 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.391 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ed74de3c-f911-4de8-b369-69f494e8aeaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-vda', 'timestamp': '2025-11-28T16:24:35.377813', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba659ba0-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4175.012030479, 'message_signature': '668b6b18e0a3e9322340490eb829a7806e0a89a13d6c35f5216311954180a52f'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-sda', 'timestamp': '2025-11-28T16:24:35.377813', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba65a92e-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4175.012030479, 'message_signature': '5f8395a596abad87a1b01ef6da05fcc3c58a081e715ca1dffd0ac5eddd7b1009'}]}, 'timestamp': '2025-11-28 16:24:35.391878', '_unique_id': '384ed850b1c349408fcb0ec2f8d377c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.393 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/network.outgoing.packets volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99d56ea3-7144-4631-8a12-d0ad7c8aaf2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 110, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000013-cd030167-3f92-4d3e-9ecc-3be8d39ada4f-tape625ba45-02', 'timestamp': '2025-11-28T16:24:35.394138', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'tape625ba45-02', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:d5:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape625ba45-02'}, 'message_id': 'ba660cf2-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.947569146, 'message_signature': '0eb50384828662f35f0c0626e27d811c62c75a6c614013aec3108d4ab3efb4f6'}]}, 'timestamp': '2025-11-28 16:24:35.394413', '_unique_id': 'e9d0d72ddeb744c5906273e65d7304fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.394 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.395 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.395 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.395 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1892971024>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1892971024>]
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.395 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.395 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.write.latency volume: 24949309646 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.396 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce856cd4-5431-4ea8-9318-ebcdc4d8d493', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 24949309646, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-vda', 'timestamp': '2025-11-28T16:24:35.395891', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba6651da-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': '49e071540a6069fd72439db7bc750048a0c0ef63f5da156a1d2e2b97409f3d67'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-sda', 'timestamp': '2025-11-28T16:24:35.395891', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba665a18-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': '653081f652cd5db7b3d34cdd04f75149734319d2e2a277e05fade64b356a1c30'}]}, 'timestamp': '2025-11-28 16:24:35.396363', '_unique_id': '3cdaf5bf9d704948b206b32cdc73692e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.397 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '27cbe25d-506f-4863-93fe-a9bee46d1dea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000013-cd030167-3f92-4d3e-9ecc-3be8d39ada4f-tape625ba45-02', 'timestamp': '2025-11-28T16:24:35.397970', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'tape625ba45-02', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:d5:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape625ba45-02'}, 'message_id': 'ba66a1f8-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.947569146, 'message_signature': 'ab2d0c4060048d77082d29a75a61a98abf1a2701d9c2e1a8ed19fdadf386962d'}]}, 'timestamp': '2025-11-28 16:24:35.398207', '_unique_id': '99475b1629dc44d99f70fe3416c21de1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.399 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.398 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.399 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d577872-3877-4846-abf2-9c5fe7c7ab91', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000013-cd030167-3f92-4d3e-9ecc-3be8d39ada4f-tape625ba45-02', 'timestamp': '2025-11-28T16:24:35.399469', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'tape625ba45-02', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:d5:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape625ba45-02'}, 'message_id': 'ba66dcb8-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.947569146, 'message_signature': '3b0c15d74e4361e3b24222a5102320d39b93efab52fab52d480ccf15ad86c99c'}]}, 'timestamp': '2025-11-28 16:24:35.399741', '_unique_id': '4a516ccd6a9346238c3a558944cebe24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.400 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9aa1105b-4516-4035-977d-642d67869dba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000013-cd030167-3f92-4d3e-9ecc-3be8d39ada4f-tape625ba45-02', 'timestamp': '2025-11-28T16:24:35.400925', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'tape625ba45-02', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:d5:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape625ba45-02'}, 'message_id': 'ba67157a-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.947569146, 'message_signature': 'e8e62ace9c0fb83c26d30de54734f0abbe94052c0b75f2b5112be32ed20136f1'}]}, 'timestamp': '2025-11-28 16:24:35.401160', '_unique_id': 'a5442d7bba3049fc8714afbe344b660c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.401 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.402 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.402 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8d5ecf1-aca5-4f5f-a6b1-ff061080dfdd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000013-cd030167-3f92-4d3e-9ecc-3be8d39ada4f-tape625ba45-02', 'timestamp': '2025-11-28T16:24:35.402317', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'tape625ba45-02', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:d5:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape625ba45-02'}, 'message_id': 'ba674cfc-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.947569146, 'message_signature': 'e90bc8c14635716ddedadf6197eff283a6f61d2d74b7651f77343a8e1948853e'}]}, 'timestamp': '2025-11-28 16:24:35.402590', '_unique_id': '573a6f0172314eb2a8dc08a30e133c6f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.403 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2dca2c97-cb3c-4636-bf1b-dde2481a1c57', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000013-cd030167-3f92-4d3e-9ecc-3be8d39ada4f-tape625ba45-02', 'timestamp': '2025-11-28T16:24:35.403845', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'tape625ba45-02', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:d5:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape625ba45-02'}, 'message_id': 'ba678884-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.947569146, 'message_signature': 'd77019f5b99a495a4cb536065d40fe113047fdae2849508de2fcf32a4d5fc244'}]}, 'timestamp': '2025-11-28 16:24:35.404125', '_unique_id': '60f0df77daf34dc5bf6a405b18e8ded7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.404 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.405 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.405 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.405 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '355221a5-08a3-4e9f-be54-6edf220cb7a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-vda', 'timestamp': '2025-11-28T16:24:35.405467', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba67c93e-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4175.012030479, 'message_signature': 'df6244a8963a2e1f4fd942a1bc2cd63113675c80f2e69ce6855a091c4da12e70'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-sda', 'timestamp': '2025-11-28T16:24:35.405467', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba67d4c4-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4175.012030479, 'message_signature': 'c6b7a1ce206cf2de96f06b13efbfb179f68a39eda031cce109808e789e8cc29b'}]}, 'timestamp': '2025-11-28 16:24:35.406051', '_unique_id': 'ad7b3b2ecc984df4bb04338aed834305'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.406 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.407 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.407 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.write.requests volume: 309 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.407 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10e9706c-a671-49fc-b40e-7fdb49950272', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 309, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-vda', 'timestamp': '2025-11-28T16:24:35.407263', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba680d40-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': '96bb7769411ffc99cbc3d2eb7e2e125ef6d92cbd5734b4372ec9d59a73022a1d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-sda', 'timestamp': '2025-11-28T16:24:35.407263', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba6815ce-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': 'fdd6787ef3a2aa4394b40cebf57de05c1aa6c4faa912d783f2cb5c953cca39c5'}]}, 'timestamp': '2025-11-28 16:24:35.407708', '_unique_id': 'b83f8aa41a2d4d6ea2434e9dfcba065e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.408 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/network.incoming.bytes volume: 19418 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad65e8a3-f4b1-41f6-8c40-7ade7925bbf1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 19418, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000013-cd030167-3f92-4d3e-9ecc-3be8d39ada4f-tape625ba45-02', 'timestamp': '2025-11-28T16:24:35.408962', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'tape625ba45-02', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:d5:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape625ba45-02'}, 'message_id': 'ba685098-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.947569146, 'message_signature': '40fb9f28e0d53a86f06beabdaf9aea3eaa0320083c9d5043aa0dd37e03a7d7c9'}]}, 'timestamp': '2025-11-28 16:24:35.409281', '_unique_id': '29290724ef5e4f00bd295b6d6e86e128'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.409 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.410 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.410 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.410 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49aafd6e-c517-4bdf-a775-60405cd109af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-vda', 'timestamp': '2025-11-28T16:24:35.410574', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba688e50-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4175.012030479, 'message_signature': 'd7414850dbbfa39e4cd53f47d7770da17306f1b6c5fa4eea94a7143d11f5b4f2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 
'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-sda', 'timestamp': '2025-11-28T16:24:35.410574', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba6896de-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4175.012030479, 'message_signature': '507b9dca484c8bf5748a3c367431210fc93862db742fefa3161afe842f3149b6'}]}, 'timestamp': '2025-11-28 16:24:35.411011', '_unique_id': '64abf5bbde394a948be089bffebee8b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.411 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.412 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.412 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.read.latency volume: 216011874 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.412 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.read.latency volume: 28440026 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ba869531-4c95-458b-aab2-8b879d1b7a5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 216011874, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-vda', 'timestamp': '2025-11-28T16:24:35.412108', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba68ca6e-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': '9e53d8609f0779463163d2b13788fc62c57f24eb7063b8c0a3b54ab596b71fdd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 28440026, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': 
None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-sda', 'timestamp': '2025-11-28T16:24:35.412108', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba68d356-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': 'bfba06569294b1ecfcf86d03a786aad41966161790bbf61d1ccc77c28c3f8614'}]}, 'timestamp': '2025-11-28 16:24:35.412586', '_unique_id': '72771d1d697c4f11be040459806a7421'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.413 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/network.incoming.packets volume: 107 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ceb8f41e-4ee3-45d4-a7c5-eddee941417d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 107, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000013-cd030167-3f92-4d3e-9ecc-3be8d39ada4f-tape625ba45-02', 'timestamp': '2025-11-28T16:24:35.413819', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'tape625ba45-02', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:d5:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape625ba45-02'}, 'message_id': 'ba690d9e-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.947569146, 'message_signature': 'f15c84cb3e4faed8cb2a1e84511a2a388252e59778f0699b0173b6d8dc51964b'}]}, 'timestamp': '2025-11-28 16:24:35.414062', '_unique_id': '2ac92d3844d742ac881f1ca3eb5e1e0f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.414 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.415 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.415 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.415 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1892971024>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1892971024>]
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.415 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.415 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1850e5df-7410-4d9d-b418-b441a49890c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-00000013-cd030167-3f92-4d3e-9ecc-3be8d39ada4f-tape625ba45-02', 'timestamp': '2025-11-28T16:24:35.415617', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'tape625ba45-02', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:d5:83', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape625ba45-02'}, 'message_id': 'ba695498-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.947569146, 'message_signature': '6004fcf123442eabfa4e3219968128802d0e2274ee7032a91b26bc23f14facd0'}]}, 'timestamp': '2025-11-28 16:24:35.415941', '_unique_id': '3106d8b4b3bf4da29650d90e25849e5a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.416 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.417 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.417 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.read.requests volume: 1068 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.417 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1cdb45ab-57cb-42df-bfe4-9909874a3b9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1068, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-vda', 'timestamp': '2025-11-28T16:24:35.417441', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba699a8e-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': '5c99f396ac258eece73153686dae7e911e736047c6585fc855652ac3de7697c9'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-sda', 'timestamp': '2025-11-28T16:24:35.417441', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba69a27c-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': '5bfc4b55040b74a13ef318c5188d5f61289bbcee641c6ee4906061c5003392df'}]}, 'timestamp': '2025-11-28 16:24:35.417858', '_unique_id': 'c11c09f6017a4b67a7bdb69a0d089d53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.418 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.read.bytes volume: 29522432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29f4339b-6a15-4e7b-829c-f9fff3599ece', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 29522432, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-vda', 'timestamp': '2025-11-28T16:24:35.419074', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ba69da44-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': '5ca4884da634abcd46d79addb25ed7a2935c74874ec979ec13f1c164f0b5884b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 
'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f-sda', 'timestamp': '2025-11-28T16:24:35.419074', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ba69e214-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4174.955826787, 'message_signature': '31356944d4862849dad488e1e09f4122d73e1754f1b42b4d30e45056b4d55ad5'}]}, 'timestamp': '2025-11-28 16:24:35.419486', '_unique_id': 'e9ddd05d2f844634a99afef1f888927c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.419 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.420 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.420 12 DEBUG ceilometer.compute.pollsters [-] cd030167-3f92-4d3e-9ecc-3be8d39ada4f/cpu volume: 12890000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4dcfe0a9-994b-4988-9a51-a8550bbb75fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12890000000, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'timestamp': '2025-11-28T16:24:35.420626', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1892971024', 'name': 'instance-00000013', 'instance_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'ba6a16c6-cc76-11f0-bcca-fa163efe7585', 'monotonic_time': 4175.008066682, 'message_signature': '3057565e06bdba1fb99ad7af19426c747401ee6100a5477e064c868e25d55ca0'}]}, 'timestamp': '2025-11-28 16:24:35.420871', '_unique_id': '1de4531e2982407796de2fe6bdb7255d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:24:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:24:35.421 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.467 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.468 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.507 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.509 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.510 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.577 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.579 187256 DEBUG nova.virt.disk.api [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Checking if we can resize image /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.579 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.651 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.653 187256 DEBUG nova.virt.disk.api [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Cannot resize image /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.654 187256 DEBUG nova.objects.instance [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'migration_context' on Instance uuid 8dd47dea-36db-454f-9c3e-db6c599d52c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.674 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.675 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Ensure instance console log exists: /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.675 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.676 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.676 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.717 187256 DEBUG nova.policy [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:24:35 np0005538960 nova_compute[187252]: 2025-11-28 16:24:35.870 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:36 np0005538960 podman[217728]: 2025-11-28 16:24:36.243416208 +0000 UTC m=+0.142944501 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:24:36 np0005538960 podman[217727]: 2025-11-28 16:24:36.290129588 +0000 UTC m=+0.062527197 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 11:24:37 np0005538960 nova_compute[187252]: 2025-11-28 16:24:37.760 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:38 np0005538960 nova_compute[187252]: 2025-11-28 16:24:38.176 187256 DEBUG nova.network.neutron [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Successfully created port: b825f862-21ed-468e-b094-9a9b2a09a912 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:24:38 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:38Z|00095|binding|INFO|Releasing lport 59b9a1a2-0895-427c-9467-c27cddf3180e from this chassis (sb_readonly=0)
Nov 28 11:24:38 np0005538960 nova_compute[187252]: 2025-11-28 16:24:38.732 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:38 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:38Z|00096|binding|INFO|Releasing lport 59b9a1a2-0895-427c-9467-c27cddf3180e from this chassis (sb_readonly=0)
Nov 28 11:24:38 np0005538960 nova_compute[187252]: 2025-11-28 16:24:38.892 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:40 np0005538960 nova_compute[187252]: 2025-11-28 16:24:40.542 187256 DEBUG nova.network.neutron [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Successfully updated port: b825f862-21ed-468e-b094-9a9b2a09a912 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:24:40 np0005538960 nova_compute[187252]: 2025-11-28 16:24:40.556 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "refresh_cache-8dd47dea-36db-454f-9c3e-db6c599d52c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:24:40 np0005538960 nova_compute[187252]: 2025-11-28 16:24:40.557 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquired lock "refresh_cache-8dd47dea-36db-454f-9c3e-db6c599d52c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:24:40 np0005538960 nova_compute[187252]: 2025-11-28 16:24:40.557 187256 DEBUG nova.network.neutron [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:24:40 np0005538960 nova_compute[187252]: 2025-11-28 16:24:40.776 187256 DEBUG nova.network.neutron [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:24:40 np0005538960 nova_compute[187252]: 2025-11-28 16:24:40.922 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:41 np0005538960 podman[217768]: 2025-11-28 16:24:41.196091665 +0000 UTC m=+0.090793608 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.091 187256 DEBUG nova.network.neutron [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Updating instance_info_cache with network_info: [{"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.124 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Releasing lock "refresh_cache-8dd47dea-36db-454f-9c3e-db6c599d52c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.124 187256 DEBUG nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Instance network_info: |[{"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.126 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Start _get_guest_xml network_info=[{"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.133 187256 WARNING nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.138 187256 DEBUG nova.virt.libvirt.host [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.139 187256 DEBUG nova.virt.libvirt.host [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.144 187256 DEBUG nova.virt.libvirt.host [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.145 187256 DEBUG nova.virt.libvirt.host [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.147 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.147 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.148 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.148 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.148 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.149 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.149 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.149 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.150 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.150 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.150 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.150 187256 DEBUG nova.virt.hardware [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.155 187256 DEBUG nova.virt.libvirt.vif [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2117221755',display_name='tempest-TestNetworkBasicOps-server-2117221755',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2117221755',id=23,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJhJ2hOHEB931r9C3eSq49Gr7P7HJKZVYcmcXnteaTZ4DABcpyhZlGAYsk7YbdH/OVo45o7K+onP4ieZBitGo89ZJOez4CB2m6HVQCdF4f/g5DUCx3qKyWI8U3hIVx8Xcg==',key_name='tempest-TestNetworkBasicOps-1618915477',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-537zis2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:24:35Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=8dd47dea-36db-454f-9c3e-db6c599d52c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.155 187256 DEBUG nova.network.os_vif_util [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.156 187256 DEBUG nova.network.os_vif_util [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:17:a4,bridge_name='br-int',has_traffic_filtering=True,id=b825f862-21ed-468e-b094-9a9b2a09a912,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825f862-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.158 187256 DEBUG nova.objects.instance [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'pci_devices' on Instance uuid 8dd47dea-36db-454f-9c3e-db6c599d52c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.185 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <uuid>8dd47dea-36db-454f-9c3e-db6c599d52c0</uuid>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <name>instance-00000017</name>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkBasicOps-server-2117221755</nova:name>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:24:42</nova:creationTime>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:        <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:        <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:        <nova:port uuid="b825f862-21ed-468e-b094-9a9b2a09a912">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <entry name="serial">8dd47dea-36db-454f-9c3e-db6c599d52c0</entry>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <entry name="uuid">8dd47dea-36db-454f-9c3e-db6c599d52c0</entry>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk.config"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:c7:17:a4"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <target dev="tapb825f862-21"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/console.log" append="off"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:24:42 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:24:42 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:24:42 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:24:42 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.187 187256 DEBUG nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Preparing to wait for external event network-vif-plugged-b825f862-21ed-468e-b094-9a9b2a09a912 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.187 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.187 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.187 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.188 187256 DEBUG nova.virt.libvirt.vif [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2117221755',display_name='tempest-TestNetworkBasicOps-server-2117221755',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2117221755',id=23,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJhJ2hOHEB931r9C3eSq49Gr7P7HJKZVYcmcXnteaTZ4DABcpyhZlGAYsk7YbdH/OVo45o7K+onP4ieZBitGo89ZJOez4CB2m6HVQCdF4f/g5DUCx3qKyWI8U3hIVx8Xcg==',key_name='tempest-TestNetworkBasicOps-1618915477',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-537zis2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:24:35Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=8dd47dea-36db-454f-9c3e-db6c599d52c0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.189 187256 DEBUG nova.network.os_vif_util [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.189 187256 DEBUG nova.network.os_vif_util [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c7:17:a4,bridge_name='br-int',has_traffic_filtering=True,id=b825f862-21ed-468e-b094-9a9b2a09a912,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825f862-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.190 187256 DEBUG os_vif [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:17:a4,bridge_name='br-int',has_traffic_filtering=True,id=b825f862-21ed-468e-b094-9a9b2a09a912,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825f862-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.190 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.191 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.191 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.196 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.196 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb825f862-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.197 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb825f862-21, col_values=(('external_ids', {'iface-id': 'b825f862-21ed-468e-b094-9a9b2a09a912', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c7:17:a4', 'vm-uuid': '8dd47dea-36db-454f-9c3e-db6c599d52c0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.199 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.201 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:24:42 np0005538960 NetworkManager[55548]: <info>  [1764347082.2008] manager: (tapb825f862-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.209 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.212 187256 INFO os_vif [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c7:17:a4,bridge_name='br-int',has_traffic_filtering=True,id=b825f862-21ed-468e-b094-9a9b2a09a912,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825f862-21')#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.365 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.366 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.366 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No VIF found with MAC fa:16:3e:c7:17:a4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.367 187256 INFO nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Using config drive#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.374 187256 DEBUG nova.compute.manager [req-14a8cd63-82f2-473d-a8ea-7b105fe8c824 req-852d6055-0537-4afb-b1d3-705e3f8bb714 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Received event network-changed-b825f862-21ed-468e-b094-9a9b2a09a912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.374 187256 DEBUG nova.compute.manager [req-14a8cd63-82f2-473d-a8ea-7b105fe8c824 req-852d6055-0537-4afb-b1d3-705e3f8bb714 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Refreshing instance network info cache due to event network-changed-b825f862-21ed-468e-b094-9a9b2a09a912. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.375 187256 DEBUG oslo_concurrency.lockutils [req-14a8cd63-82f2-473d-a8ea-7b105fe8c824 req-852d6055-0537-4afb-b1d3-705e3f8bb714 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-8dd47dea-36db-454f-9c3e-db6c599d52c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.375 187256 DEBUG oslo_concurrency.lockutils [req-14a8cd63-82f2-473d-a8ea-7b105fe8c824 req-852d6055-0537-4afb-b1d3-705e3f8bb714 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-8dd47dea-36db-454f-9c3e-db6c599d52c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.375 187256 DEBUG nova.network.neutron [req-14a8cd63-82f2-473d-a8ea-7b105fe8c824 req-852d6055-0537-4afb-b1d3-705e3f8bb714 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Refreshing network info cache for port b825f862-21ed-468e-b094-9a9b2a09a912 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.762 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.848 187256 INFO nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Creating config drive at /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk.config#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.853 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw_sv4j2g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:24:42 np0005538960 nova_compute[187252]: 2025-11-28 16:24:42.985 187256 DEBUG oslo_concurrency.processutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpw_sv4j2g" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:24:43 np0005538960 kernel: tapb825f862-21: entered promiscuous mode
Nov 28 11:24:43 np0005538960 NetworkManager[55548]: <info>  [1764347083.0482] manager: (tapb825f862-21): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Nov 28 11:24:43 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:43Z|00097|binding|INFO|Claiming lport b825f862-21ed-468e-b094-9a9b2a09a912 for this chassis.
Nov 28 11:24:43 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:43Z|00098|binding|INFO|b825f862-21ed-468e-b094-9a9b2a09a912: Claiming fa:16:3e:c7:17:a4 10.100.0.5
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.051 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.060 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:17:a4 10.100.0.5'], port_security=['fa:16:3e:c7:17:a4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '8dd47dea-36db-454f-9c3e-db6c599d52c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57ac1780-c5b4-496c-8b5e-7335798054b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b92dfe7f-c78a-4249-87ce-317bc66b3c6e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4d55fe3-0802-461e-86b4-090e5688fd31, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=b825f862-21ed-468e-b094-9a9b2a09a912) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.062 104369 INFO neutron.agent.ovn.metadata.agent [-] Port b825f862-21ed-468e-b094-9a9b2a09a912 in datapath 57ac1780-c5b4-496c-8b5e-7335798054b3 bound to our chassis#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.063 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 57ac1780-c5b4-496c-8b5e-7335798054b3#033[00m
Nov 28 11:24:43 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:43Z|00099|binding|INFO|Setting lport b825f862-21ed-468e-b094-9a9b2a09a912 ovn-installed in OVS
Nov 28 11:24:43 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:43Z|00100|binding|INFO|Setting lport b825f862-21ed-468e-b094-9a9b2a09a912 up in Southbound
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.071 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:43 np0005538960 systemd-udevd[217813]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.085 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e29bcf08-224f-4735-9a70-619487579161]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:24:43 np0005538960 systemd-machined[153518]: New machine qemu-8-instance-00000017.
Nov 28 11:24:43 np0005538960 NetworkManager[55548]: <info>  [1764347083.1020] device (tapb825f862-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:24:43 np0005538960 NetworkManager[55548]: <info>  [1764347083.1031] device (tapb825f862-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:24:43 np0005538960 systemd[1]: Started Virtual Machine qemu-8-instance-00000017.
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.125 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[73c4c84d-e7f1-45bd-9d48-03cc4211995e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.132 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[ea65e9fd-a388-4991-9091-3b973b31e173]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.167 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[44cfaa31-dff5-478d-a2b0-21929c778311]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.191 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a42d34ce-e019-4314-8f08-d13bc60e6c3c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57ac1780-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:05:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412669, 'reachable_time': 19610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217826, 'error': None, 'target': 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.213 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ffe41a13-445d-41f6-9fc5-a393c3518fcb]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap57ac1780-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412682, 'tstamp': 412682}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217828, 'error': None, 'target': 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap57ac1780-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412686, 'tstamp': 412686}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217828, 'error': None, 'target': 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.216 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57ac1780-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.218 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.219 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.220 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57ac1780-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.221 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.221 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap57ac1780-c0, col_values=(('external_ids', {'iface-id': '59b9a1a2-0895-427c-9467-c27cddf3180e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:24:43 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:24:43.221 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.484 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347083.4834733, 8dd47dea-36db-454f-9c3e-db6c599d52c0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.485 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] VM Started (Lifecycle Event)#033[00m
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.526 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.532 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347083.4843564, 8dd47dea-36db-454f-9c3e-db6c599d52c0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.532 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.558 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.562 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:24:43 np0005538960 nova_compute[187252]: 2025-11-28 16:24:43.579 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:24:44 np0005538960 podman[217836]: 2025-11-28 16:24:44.179411595 +0000 UTC m=+0.078140268 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vcs-type=git, name=ubi9-minimal, architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=edpm, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.520 187256 DEBUG nova.compute.manager [req-22958005-de60-4172-a25a-f2c4e962cd20 req-901c6a76-b245-4484-adab-db3bdfebccb2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Received event network-vif-plugged-b825f862-21ed-468e-b094-9a9b2a09a912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.521 187256 DEBUG oslo_concurrency.lockutils [req-22958005-de60-4172-a25a-f2c4e962cd20 req-901c6a76-b245-4484-adab-db3bdfebccb2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.521 187256 DEBUG oslo_concurrency.lockutils [req-22958005-de60-4172-a25a-f2c4e962cd20 req-901c6a76-b245-4484-adab-db3bdfebccb2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.522 187256 DEBUG oslo_concurrency.lockutils [req-22958005-de60-4172-a25a-f2c4e962cd20 req-901c6a76-b245-4484-adab-db3bdfebccb2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.522 187256 DEBUG nova.compute.manager [req-22958005-de60-4172-a25a-f2c4e962cd20 req-901c6a76-b245-4484-adab-db3bdfebccb2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Processing event network-vif-plugged-b825f862-21ed-468e-b094-9a9b2a09a912 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.524 187256 DEBUG nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.528 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347084.5281878, 8dd47dea-36db-454f-9c3e-db6c599d52c0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.529 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.531 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.537 187256 INFO nova.virt.libvirt.driver [-] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Instance spawned successfully.#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.538 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.565 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.577 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.581 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.582 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.583 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.583 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.584 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.584 187256 DEBUG nova.virt.libvirt.driver [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.658 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.694 187256 INFO nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Took 9.40 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.695 187256 DEBUG nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.778 187256 INFO nova.compute.manager [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Took 9.90 seconds to build instance.#033[00m
Nov 28 11:24:44 np0005538960 nova_compute[187252]: 2025-11-28 16:24:44.799 187256 DEBUG oslo_concurrency.lockutils [None req-112ae682-c384-4ff6-8f6f-8f138ac614a1 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:24:45 np0005538960 nova_compute[187252]: 2025-11-28 16:24:45.301 187256 DEBUG nova.network.neutron [req-14a8cd63-82f2-473d-a8ea-7b105fe8c824 req-852d6055-0537-4afb-b1d3-705e3f8bb714 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Updated VIF entry in instance network info cache for port b825f862-21ed-468e-b094-9a9b2a09a912. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:24:45 np0005538960 nova_compute[187252]: 2025-11-28 16:24:45.302 187256 DEBUG nova.network.neutron [req-14a8cd63-82f2-473d-a8ea-7b105fe8c824 req-852d6055-0537-4afb-b1d3-705e3f8bb714 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Updating instance_info_cache with network_info: [{"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:24:45 np0005538960 nova_compute[187252]: 2025-11-28 16:24:45.313 187256 DEBUG oslo_concurrency.lockutils [req-14a8cd63-82f2-473d-a8ea-7b105fe8c824 req-852d6055-0537-4afb-b1d3-705e3f8bb714 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-8dd47dea-36db-454f-9c3e-db6c599d52c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:24:46 np0005538960 nova_compute[187252]: 2025-11-28 16:24:46.599 187256 DEBUG nova.compute.manager [req-95d648ee-e1d5-438f-a930-9cb9d899e425 req-5f393a0b-ddbf-463b-89e4-5a0cb32f2141 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Received event network-vif-plugged-b825f862-21ed-468e-b094-9a9b2a09a912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:24:46 np0005538960 nova_compute[187252]: 2025-11-28 16:24:46.599 187256 DEBUG oslo_concurrency.lockutils [req-95d648ee-e1d5-438f-a930-9cb9d899e425 req-5f393a0b-ddbf-463b-89e4-5a0cb32f2141 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:24:46 np0005538960 nova_compute[187252]: 2025-11-28 16:24:46.599 187256 DEBUG oslo_concurrency.lockutils [req-95d648ee-e1d5-438f-a930-9cb9d899e425 req-5f393a0b-ddbf-463b-89e4-5a0cb32f2141 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:24:46 np0005538960 nova_compute[187252]: 2025-11-28 16:24:46.600 187256 DEBUG oslo_concurrency.lockutils [req-95d648ee-e1d5-438f-a930-9cb9d899e425 req-5f393a0b-ddbf-463b-89e4-5a0cb32f2141 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:24:46 np0005538960 nova_compute[187252]: 2025-11-28 16:24:46.600 187256 DEBUG nova.compute.manager [req-95d648ee-e1d5-438f-a930-9cb9d899e425 req-5f393a0b-ddbf-463b-89e4-5a0cb32f2141 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] No waiting events found dispatching network-vif-plugged-b825f862-21ed-468e-b094-9a9b2a09a912 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:24:46 np0005538960 nova_compute[187252]: 2025-11-28 16:24:46.600 187256 WARNING nova.compute.manager [req-95d648ee-e1d5-438f-a930-9cb9d899e425 req-5f393a0b-ddbf-463b-89e4-5a0cb32f2141 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Received unexpected event network-vif-plugged-b825f862-21ed-468e-b094-9a9b2a09a912 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:24:47 np0005538960 nova_compute[187252]: 2025-11-28 16:24:47.199 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:47 np0005538960 nova_compute[187252]: 2025-11-28 16:24:47.769 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:49 np0005538960 nova_compute[187252]: 2025-11-28 16:24:49.147 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:49 np0005538960 NetworkManager[55548]: <info>  [1764347089.1483] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Nov 28 11:24:49 np0005538960 NetworkManager[55548]: <info>  [1764347089.1494] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Nov 28 11:24:49 np0005538960 nova_compute[187252]: 2025-11-28 16:24:49.202 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:49 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:49Z|00101|binding|INFO|Releasing lport 59b9a1a2-0895-427c-9467-c27cddf3180e from this chassis (sb_readonly=0)
Nov 28 11:24:49 np0005538960 nova_compute[187252]: 2025-11-28 16:24:49.223 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:49 np0005538960 nova_compute[187252]: 2025-11-28 16:24:49.624 187256 DEBUG nova.compute.manager [req-1cf2d7ea-096e-433d-896f-c20f2d599d6a req-6411a5d8-58ea-4cdf-8431-e6767d47cbf6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Received event network-changed-b825f862-21ed-468e-b094-9a9b2a09a912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:24:49 np0005538960 nova_compute[187252]: 2025-11-28 16:24:49.625 187256 DEBUG nova.compute.manager [req-1cf2d7ea-096e-433d-896f-c20f2d599d6a req-6411a5d8-58ea-4cdf-8431-e6767d47cbf6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Refreshing instance network info cache due to event network-changed-b825f862-21ed-468e-b094-9a9b2a09a912. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:24:49 np0005538960 nova_compute[187252]: 2025-11-28 16:24:49.625 187256 DEBUG oslo_concurrency.lockutils [req-1cf2d7ea-096e-433d-896f-c20f2d599d6a req-6411a5d8-58ea-4cdf-8431-e6767d47cbf6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-8dd47dea-36db-454f-9c3e-db6c599d52c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:24:49 np0005538960 nova_compute[187252]: 2025-11-28 16:24:49.625 187256 DEBUG oslo_concurrency.lockutils [req-1cf2d7ea-096e-433d-896f-c20f2d599d6a req-6411a5d8-58ea-4cdf-8431-e6767d47cbf6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-8dd47dea-36db-454f-9c3e-db6c599d52c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:24:49 np0005538960 nova_compute[187252]: 2025-11-28 16:24:49.626 187256 DEBUG nova.network.neutron [req-1cf2d7ea-096e-433d-896f-c20f2d599d6a req-6411a5d8-58ea-4cdf-8431-e6767d47cbf6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Refreshing network info cache for port b825f862-21ed-468e-b094-9a9b2a09a912 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:24:52 np0005538960 nova_compute[187252]: 2025-11-28 16:24:52.202 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:52 np0005538960 nova_compute[187252]: 2025-11-28 16:24:52.838 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:52 np0005538960 nova_compute[187252]: 2025-11-28 16:24:52.972 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:53 np0005538960 podman[217862]: 2025-11-28 16:24:53.189489914 +0000 UTC m=+0.082457514 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 11:24:54 np0005538960 nova_compute[187252]: 2025-11-28 16:24:54.164 187256 DEBUG nova.network.neutron [req-1cf2d7ea-096e-433d-896f-c20f2d599d6a req-6411a5d8-58ea-4cdf-8431-e6767d47cbf6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Updated VIF entry in instance network info cache for port b825f862-21ed-468e-b094-9a9b2a09a912. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:24:54 np0005538960 nova_compute[187252]: 2025-11-28 16:24:54.165 187256 DEBUG nova.network.neutron [req-1cf2d7ea-096e-433d-896f-c20f2d599d6a req-6411a5d8-58ea-4cdf-8431-e6767d47cbf6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Updating instance_info_cache with network_info: [{"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:24:54 np0005538960 nova_compute[187252]: 2025-11-28 16:24:54.186 187256 DEBUG oslo_concurrency.lockutils [req-1cf2d7ea-096e-433d-896f-c20f2d599d6a req-6411a5d8-58ea-4cdf-8431-e6767d47cbf6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-8dd47dea-36db-454f-9c3e-db6c599d52c0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:24:56 np0005538960 podman[217882]: 2025-11-28 16:24:56.159611961 +0000 UTC m=+0.064595787 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 11:24:57 np0005538960 nova_compute[187252]: 2025-11-28 16:24:57.205 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:57 np0005538960 nova_compute[187252]: 2025-11-28 16:24:57.841 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:24:59 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:59Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c7:17:a4 10.100.0.5
Nov 28 11:24:59 np0005538960 ovn_controller[95460]: 2025-11-28T16:24:59Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c7:17:a4 10.100.0.5
Nov 28 11:25:01 np0005538960 nova_compute[187252]: 2025-11-28 16:25:01.014 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:02 np0005538960 nova_compute[187252]: 2025-11-28 16:25:02.208 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:02 np0005538960 nova_compute[187252]: 2025-11-28 16:25:02.844 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:03 np0005538960 podman[217920]: 2025-11-28 16:25:03.259817475 +0000 UTC m=+0.161319053 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 11:25:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:05.447 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:25:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:05.448 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:25:05 np0005538960 nova_compute[187252]: 2025-11-28 16:25:05.453 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.092 187256 INFO nova.compute.manager [None req-42177e27-8515-41d7-9e20-bda97dc107ee a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Get console output#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.098 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.345 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.346 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.346 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.433 187256 DEBUG oslo_concurrency.lockutils [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "8dd47dea-36db-454f-9c3e-db6c599d52c0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.434 187256 DEBUG oslo_concurrency.lockutils [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.434 187256 DEBUG oslo_concurrency.lockutils [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.435 187256 DEBUG oslo_concurrency.lockutils [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.435 187256 DEBUG oslo_concurrency.lockutils [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.436 187256 INFO nova.compute.manager [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Terminating instance#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.437 187256 DEBUG nova.compute.manager [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:25:06 np0005538960 kernel: tapb825f862-21 (unregistering): left promiscuous mode
Nov 28 11:25:06 np0005538960 NetworkManager[55548]: <info>  [1764347106.4698] device (tapb825f862-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.485 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:06Z|00102|binding|INFO|Releasing lport b825f862-21ed-468e-b094-9a9b2a09a912 from this chassis (sb_readonly=0)
Nov 28 11:25:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:06Z|00103|binding|INFO|Setting lport b825f862-21ed-468e-b094-9a9b2a09a912 down in Southbound
Nov 28 11:25:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:06Z|00104|binding|INFO|Removing iface tapb825f862-21 ovn-installed in OVS
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.487 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.493 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c7:17:a4 10.100.0.5'], port_security=['fa:16:3e:c7:17:a4 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '8dd47dea-36db-454f-9c3e-db6c599d52c0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57ac1780-c5b4-496c-8b5e-7335798054b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b92dfe7f-c78a-4249-87ce-317bc66b3c6e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.189'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4d55fe3-0802-461e-86b4-090e5688fd31, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=b825f862-21ed-468e-b094-9a9b2a09a912) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.495 104369 INFO neutron.agent.ovn.metadata.agent [-] Port b825f862-21ed-468e-b094-9a9b2a09a912 in datapath 57ac1780-c5b4-496c-8b5e-7335798054b3 unbound from our chassis#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.497 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 57ac1780-c5b4-496c-8b5e-7335798054b3#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.515 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1b54d033-9e6b-416d-97e9-b4b3103dcf7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.550 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.576 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5e30f9-ba1d-4663-9c0a-e508b986870f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.580 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[893a3e3f-0832-4ed8-b839-f976b21ec366]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:06 np0005538960 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000017.scope: Deactivated successfully.
Nov 28 11:25:06 np0005538960 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000017.scope: Consumed 14.312s CPU time.
Nov 28 11:25:06 np0005538960 systemd-machined[153518]: Machine qemu-8-instance-00000017 terminated.
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.612 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[45a62a06-ece7-4ef0-b8b7-5ee28dabb761]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:06 np0005538960 podman[217960]: 2025-11-28 16:25:06.621049639 +0000 UTC m=+0.062241217 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:25:06 np0005538960 podman[217946]: 2025-11-28 16:25:06.63014653 +0000 UTC m=+0.117792921 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.635 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1c2e48f9-009c-4ca3-b681-1f8d65612628]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap57ac1780-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b5:05:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412669, 'reachable_time': 19610, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217994, 'error': None, 'target': 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.657 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b681b6ee-d4fa-468e-9e17-f34ca783bc1e]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap57ac1780-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412682, 'tstamp': 412682}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217995, 'error': None, 'target': 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap57ac1780-c1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412686, 'tstamp': 412686}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217995, 'error': None, 'target': 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.659 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57ac1780-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.660 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.666 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.666 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57ac1780-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.667 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.667 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap57ac1780-c0, col_values=(('external_ids', {'iface-id': '59b9a1a2-0895-427c-9467-c27cddf3180e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:06.667 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.706 187256 INFO nova.virt.libvirt.driver [-] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Instance destroyed successfully.#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.707 187256 DEBUG nova.objects.instance [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'resources' on Instance uuid 8dd47dea-36db-454f-9c3e-db6c599d52c0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.720 187256 DEBUG nova.virt.libvirt.vif [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:24:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-2117221755',display_name='tempest-TestNetworkBasicOps-server-2117221755',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-2117221755',id=23,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJhJ2hOHEB931r9C3eSq49Gr7P7HJKZVYcmcXnteaTZ4DABcpyhZlGAYsk7YbdH/OVo45o7K+onP4ieZBitGo89ZJOez4CB2m6HVQCdF4f/g5DUCx3qKyWI8U3hIVx8Xcg==',key_name='tempest-TestNetworkBasicOps-1618915477',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:24:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-537zis2l',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:24:44Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=8dd47dea-36db-454f-9c3e-db6c599d52c0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.720 187256 DEBUG nova.network.os_vif_util [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "b825f862-21ed-468e-b094-9a9b2a09a912", "address": "fa:16:3e:c7:17:a4", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.189", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb825f862-21", "ovs_interfaceid": "b825f862-21ed-468e-b094-9a9b2a09a912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.722 187256 DEBUG nova.network.os_vif_util [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c7:17:a4,bridge_name='br-int',has_traffic_filtering=True,id=b825f862-21ed-468e-b094-9a9b2a09a912,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825f862-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.722 187256 DEBUG os_vif [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c7:17:a4,bridge_name='br-int',has_traffic_filtering=True,id=b825f862-21ed-468e-b094-9a9b2a09a912,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825f862-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.726 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.726 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb825f862-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.728 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.731 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.734 187256 INFO os_vif [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c7:17:a4,bridge_name='br-int',has_traffic_filtering=True,id=b825f862-21ed-468e-b094-9a9b2a09a912,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb825f862-21')#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.734 187256 INFO nova.virt.libvirt.driver [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Deleting instance files /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0_del#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.736 187256 INFO nova.virt.libvirt.driver [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Deletion of /var/lib/nova/instances/8dd47dea-36db-454f-9c3e-db6c599d52c0_del complete#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.780 187256 INFO nova.compute.manager [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.781 187256 DEBUG oslo.service.loopingcall [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.781 187256 DEBUG nova.compute.manager [-] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:25:06 np0005538960 nova_compute[187252]: 2025-11-28 16:25:06.781 187256 DEBUG nova.network.neutron [-] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:25:07 np0005538960 nova_compute[187252]: 2025-11-28 16:25:07.846 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.875 187256 DEBUG nova.network.neutron [-] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.894 187256 INFO nova.compute.manager [-] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Took 3.11 seconds to deallocate network for instance.#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.915 187256 DEBUG nova.compute.manager [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Received event network-vif-unplugged-b825f862-21ed-468e-b094-9a9b2a09a912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.915 187256 DEBUG oslo_concurrency.lockutils [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.915 187256 DEBUG oslo_concurrency.lockutils [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.916 187256 DEBUG oslo_concurrency.lockutils [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.916 187256 DEBUG nova.compute.manager [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] No waiting events found dispatching network-vif-unplugged-b825f862-21ed-468e-b094-9a9b2a09a912 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.916 187256 DEBUG nova.compute.manager [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Received event network-vif-unplugged-b825f862-21ed-468e-b094-9a9b2a09a912 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.916 187256 DEBUG nova.compute.manager [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Received event network-vif-plugged-b825f862-21ed-468e-b094-9a9b2a09a912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.917 187256 DEBUG oslo_concurrency.lockutils [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.917 187256 DEBUG oslo_concurrency.lockutils [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.917 187256 DEBUG oslo_concurrency.lockutils [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.917 187256 DEBUG nova.compute.manager [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] No waiting events found dispatching network-vif-plugged-b825f862-21ed-468e-b094-9a9b2a09a912 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.918 187256 WARNING nova.compute.manager [req-f687a511-755e-47b0-a57d-2f80ab673c1b req-76ac41e0-5b4e-40d6-875d-b44f7091a358 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Received unexpected event network-vif-plugged-b825f862-21ed-468e-b094-9a9b2a09a912 for instance with vm_state active and task_state deleting.#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.953 187256 DEBUG oslo_concurrency.lockutils [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:09 np0005538960 nova_compute[187252]: 2025-11-28 16:25:09.953 187256 DEBUG oslo_concurrency.lockutils [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:10 np0005538960 nova_compute[187252]: 2025-11-28 16:25:10.062 187256 DEBUG nova.compute.provider_tree [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:25:10 np0005538960 nova_compute[187252]: 2025-11-28 16:25:10.077 187256 DEBUG nova.scheduler.client.report [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:25:10 np0005538960 nova_compute[187252]: 2025-11-28 16:25:10.110 187256 DEBUG oslo_concurrency.lockutils [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:10 np0005538960 nova_compute[187252]: 2025-11-28 16:25:10.140 187256 INFO nova.scheduler.client.report [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Deleted allocations for instance 8dd47dea-36db-454f-9c3e-db6c599d52c0#033[00m
Nov 28 11:25:10 np0005538960 nova_compute[187252]: 2025-11-28 16:25:10.236 187256 DEBUG oslo_concurrency.lockutils [None req-1b86a7dc-875e-4c13-b124-9aac6061f91d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "8dd47dea-36db-454f-9c3e-db6c599d52c0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:10 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:10.450 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:11 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:11Z|00105|binding|INFO|Releasing lport 59b9a1a2-0895-427c-9467-c27cddf3180e from this chassis (sb_readonly=0)
Nov 28 11:25:11 np0005538960 nova_compute[187252]: 2025-11-28 16:25:11.610 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:11 np0005538960 nova_compute[187252]: 2025-11-28 16:25:11.729 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:11 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:11Z|00106|binding|INFO|Releasing lport 59b9a1a2-0895-427c-9467-c27cddf3180e from this chassis (sb_readonly=0)
Nov 28 11:25:11 np0005538960 nova_compute[187252]: 2025-11-28 16:25:11.763 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:12 np0005538960 nova_compute[187252]: 2025-11-28 16:25:12.036 187256 DEBUG nova.compute.manager [req-22e3633b-5f73-4899-8c97-58d5ea2c2eb6 req-c22ce916-3e2d-474f-a5d1-ff5a3167f92b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Received event network-vif-deleted-b825f862-21ed-468e-b094-9a9b2a09a912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:25:12 np0005538960 podman[218030]: 2025-11-28 16:25:12.173699898 +0000 UTC m=+0.064336829 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:25:12 np0005538960 nova_compute[187252]: 2025-11-28 16:25:12.847 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:12.999 187256 DEBUG oslo_concurrency.lockutils [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.000 187256 DEBUG oslo_concurrency.lockutils [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.000 187256 DEBUG oslo_concurrency.lockutils [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.000 187256 DEBUG oslo_concurrency.lockutils [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.000 187256 DEBUG oslo_concurrency.lockutils [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.001 187256 INFO nova.compute.manager [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Terminating instance#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.002 187256 DEBUG nova.compute.manager [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:25:13 np0005538960 kernel: tape625ba45-02 (unregistering): left promiscuous mode
Nov 28 11:25:13 np0005538960 NetworkManager[55548]: <info>  [1764347113.1544] device (tape625ba45-02): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:25:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:13Z|00107|binding|INFO|Releasing lport e625ba45-02b0-4430-8aff-d9674489e676 from this chassis (sb_readonly=0)
Nov 28 11:25:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:13Z|00108|binding|INFO|Setting lport e625ba45-02b0-4430-8aff-d9674489e676 down in Southbound
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.200 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:13Z|00109|binding|INFO|Removing iface tape625ba45-02 ovn-installed in OVS
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.202 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:13Z|00110|binding|INFO|Releasing lport 59b9a1a2-0895-427c-9467-c27cddf3180e from this chassis (sb_readonly=0)
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.215 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.217 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:d5:83 10.100.0.6'], port_security=['fa:16:3e:0e:d5:83 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'cd030167-3f92-4d3e-9ecc-3be8d39ada4f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-57ac1780-c5b4-496c-8b5e-7335798054b3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd2079cea-1e9f-41b5-816a-6c797998eaed', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b4d55fe3-0802-461e-86b4-090e5688fd31, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=e625ba45-02b0-4430-8aff-d9674489e676) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.218 104369 INFO neutron.agent.ovn.metadata.agent [-] Port e625ba45-02b0-4430-8aff-d9674489e676 in datapath 57ac1780-c5b4-496c-8b5e-7335798054b3 unbound from our chassis#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.219 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 57ac1780-c5b4-496c-8b5e-7335798054b3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.221 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea006cf-0db3-442a-bed2-897df278ec91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.222 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3 namespace which is not needed anymore#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.258 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:13 np0005538960 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000013.scope: Deactivated successfully.
Nov 28 11:25:13 np0005538960 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000013.scope: Consumed 17.151s CPU time.
Nov 28 11:25:13 np0005538960 systemd-machined[153518]: Machine qemu-7-instance-00000013 terminated.
Nov 28 11:25:13 np0005538960 neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3[217444]: [NOTICE]   (217448) : haproxy version is 2.8.14-c23fe91
Nov 28 11:25:13 np0005538960 neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3[217444]: [NOTICE]   (217448) : path to executable is /usr/sbin/haproxy
Nov 28 11:25:13 np0005538960 neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3[217444]: [WARNING]  (217448) : Exiting Master process...
Nov 28 11:25:13 np0005538960 neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3[217444]: [ALERT]    (217448) : Current worker (217450) exited with code 143 (Terminated)
Nov 28 11:25:13 np0005538960 neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3[217444]: [WARNING]  (217448) : All workers exited. Exiting... (0)
Nov 28 11:25:13 np0005538960 systemd[1]: libpod-ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276.scope: Deactivated successfully.
Nov 28 11:25:13 np0005538960 conmon[217444]: conmon ea31728327cb199f4138 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276.scope/container/memory.events
Nov 28 11:25:13 np0005538960 podman[218078]: 2025-11-28 16:25:13.459795639 +0000 UTC m=+0.136213600 container died ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.485 187256 INFO nova.virt.libvirt.driver [-] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Instance destroyed successfully.#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.486 187256 DEBUG nova.objects.instance [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'resources' on Instance uuid cd030167-3f92-4d3e-9ecc-3be8d39ada4f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:25:13 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276-userdata-shm.mount: Deactivated successfully.
Nov 28 11:25:13 np0005538960 systemd[1]: var-lib-containers-storage-overlay-ea4f1003ea2ce1e09abbc02b09010d4aacd02f5aecd4ca1c88dc24de76f392df-merged.mount: Deactivated successfully.
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.504 187256 DEBUG nova.virt.libvirt.vif [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:23:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1892971024',display_name='tempest-TestNetworkBasicOps-server-1892971024',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1892971024',id=19,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG/7odxlfRfrLDu3Z3S+r1CDscLCljTk3aaVGgDh0PB+Og4NIJJkQsoCw6yTulRPk+DuAcq538jfu8LteEqYkv00icBqxxc1fJLJW3MCC/5DqlVFaqPgNv7nxJX8KuZdQg==',key_name='tempest-TestNetworkBasicOps-709816810',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:23:48Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-dlzv8dgb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:23:48Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=cd030167-3f92-4d3e-9ecc-3be8d39ada4f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.505 187256 DEBUG nova.network.os_vif_util [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "e625ba45-02b0-4430-8aff-d9674489e676", "address": "fa:16:3e:0e:d5:83", "network": {"id": "57ac1780-c5b4-496c-8b5e-7335798054b3", "bridge": "br-int", "label": "tempest-network-smoke--167795911", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape625ba45-02", "ovs_interfaceid": "e625ba45-02b0-4430-8aff-d9674489e676", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.507 187256 DEBUG nova.network.os_vif_util [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0e:d5:83,bridge_name='br-int',has_traffic_filtering=True,id=e625ba45-02b0-4430-8aff-d9674489e676,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape625ba45-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.508 187256 DEBUG os_vif [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:d5:83,bridge_name='br-int',has_traffic_filtering=True,id=e625ba45-02b0-4430-8aff-d9674489e676,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape625ba45-02') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.510 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.510 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape625ba45-02, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.515 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.518 187256 INFO os_vif [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:d5:83,bridge_name='br-int',has_traffic_filtering=True,id=e625ba45-02b0-4430-8aff-d9674489e676,network=Network(57ac1780-c5b4-496c-8b5e-7335798054b3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape625ba45-02')#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.520 187256 INFO nova.virt.libvirt.driver [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Deleting instance files /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f_del#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.521 187256 INFO nova.virt.libvirt.driver [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Deletion of /var/lib/nova/instances/cd030167-3f92-4d3e-9ecc-3be8d39ada4f_del complete#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.577 187256 INFO nova.compute.manager [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Took 0.57 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.578 187256 DEBUG oslo.service.loopingcall [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.579 187256 DEBUG nova.compute.manager [-] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.579 187256 DEBUG nova.network.neutron [-] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:25:13 np0005538960 podman[218078]: 2025-11-28 16:25:13.595135238 +0000 UTC m=+0.271553179 container cleanup ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:25:13 np0005538960 systemd[1]: libpod-conmon-ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276.scope: Deactivated successfully.
Nov 28 11:25:13 np0005538960 podman[218122]: 2025-11-28 16:25:13.670335761 +0000 UTC m=+0.047758305 container remove ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.676 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c4a8e7cb-3da2-4eb9-8ac8-a048279371a6]: (4, ('Fri Nov 28 04:25:13 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3 (ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276)\nea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276\nFri Nov 28 04:25:13 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3 (ea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276)\nea31728327cb199f4138da477a64686e4ef54a7dff5847168929c5d77195f276\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.678 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ad6f163d-bc84-48c3-831b-71f57dfd8060]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.680 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57ac1780-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.681 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:13 np0005538960 kernel: tap57ac1780-c0: left promiscuous mode
Nov 28 11:25:13 np0005538960 nova_compute[187252]: 2025-11-28 16:25:13.693 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.697 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e94c53-7fbb-4608-8dcf-f0de38478f2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.715 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[cc4db6c3-0e4a-4c18-a861-cf1c267a552f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.716 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9a2dff93-3052-4ca6-a035-3609cbe5a55f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.731 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c55da30e-fd25-43fa-8568-679d2e92e4ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412663, 'reachable_time': 17158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218137, 'error': None, 'target': 'ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.734 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-57ac1780-c5b4-496c-8b5e-7335798054b3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:25:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:13.734 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8583f1-6d69-4776-98b2-a8ed68f35e49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:13 np0005538960 systemd[1]: run-netns-ovnmeta\x2d57ac1780\x2dc5b4\x2d496c\x2d8b5e\x2d7335798054b3.mount: Deactivated successfully.
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.235 187256 DEBUG nova.network.neutron [-] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.254 187256 INFO nova.compute.manager [-] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Took 0.68 seconds to deallocate network for instance.#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.276 187256 DEBUG nova.compute.manager [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received event network-vif-unplugged-e625ba45-02b0-4430-8aff-d9674489e676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.276 187256 DEBUG oslo_concurrency.lockutils [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.276 187256 DEBUG oslo_concurrency.lockutils [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.277 187256 DEBUG oslo_concurrency.lockutils [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.277 187256 DEBUG nova.compute.manager [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] No waiting events found dispatching network-vif-unplugged-e625ba45-02b0-4430-8aff-d9674489e676 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.277 187256 DEBUG nova.compute.manager [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received event network-vif-unplugged-e625ba45-02b0-4430-8aff-d9674489e676 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.277 187256 DEBUG nova.compute.manager [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received event network-vif-plugged-e625ba45-02b0-4430-8aff-d9674489e676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.277 187256 DEBUG oslo_concurrency.lockutils [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.278 187256 DEBUG oslo_concurrency.lockutils [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.278 187256 DEBUG oslo_concurrency.lockutils [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.278 187256 DEBUG nova.compute.manager [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] No waiting events found dispatching network-vif-plugged-e625ba45-02b0-4430-8aff-d9674489e676 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.278 187256 WARNING nova.compute.manager [req-c68469d0-18f1-4366-b818-f3357409abb6 req-20be325d-c803-426c-b20f-1d079d384924 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received unexpected event network-vif-plugged-e625ba45-02b0-4430-8aff-d9674489e676 for instance with vm_state active and task_state deleting.#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.312 187256 DEBUG oslo_concurrency.lockutils [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.312 187256 DEBUG oslo_concurrency.lockutils [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.362 187256 DEBUG nova.compute.provider_tree [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.373 187256 DEBUG nova.scheduler.client.report [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.389 187256 DEBUG oslo_concurrency.lockutils [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.077s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.427 187256 INFO nova.scheduler.client.report [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Deleted allocations for instance cd030167-3f92-4d3e-9ecc-3be8d39ada4f#033[00m
Nov 28 11:25:14 np0005538960 nova_compute[187252]: 2025-11-28 16:25:14.491 187256 DEBUG oslo_concurrency.lockutils [None req-93895a5b-c118-426a-ba3f-525e2f91b037 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "cd030167-3f92-4d3e-9ecc-3be8d39ada4f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.492s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:15 np0005538960 podman[218138]: 2025-11-28 16:25:15.194292159 +0000 UTC m=+0.091001679 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git)
Nov 28 11:25:16 np0005538960 nova_compute[187252]: 2025-11-28 16:25:16.382 187256 DEBUG nova.compute.manager [req-3ac46a7d-bd44-4b81-9d39-831fedcc83fd req-ce5b0751-ec88-462f-98a5-3bff8f0204fa 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Received event network-vif-deleted-e625ba45-02b0-4430-8aff-d9674489e676 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:25:17 np0005538960 nova_compute[187252]: 2025-11-28 16:25:17.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:25:17 np0005538960 nova_compute[187252]: 2025-11-28 16:25:17.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:25:17 np0005538960 nova_compute[187252]: 2025-11-28 16:25:17.849 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:18 np0005538960 nova_compute[187252]: 2025-11-28 16:25:18.513 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.337 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.339 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.522 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.523 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5766MB free_disk=73.3424072265625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.524 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.524 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.604 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.604 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.628 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.643 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.663 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:25:19 np0005538960 nova_compute[187252]: 2025-11-28 16:25:19.664 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:20 np0005538960 nova_compute[187252]: 2025-11-28 16:25:20.663 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:25:20 np0005538960 nova_compute[187252]: 2025-11-28 16:25:20.664 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:25:20 np0005538960 nova_compute[187252]: 2025-11-28 16:25:20.664 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:25:20 np0005538960 nova_compute[187252]: 2025-11-28 16:25:20.676 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:25:20 np0005538960 nova_compute[187252]: 2025-11-28 16:25:20.677 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:25:20 np0005538960 nova_compute[187252]: 2025-11-28 16:25:20.677 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:25:20 np0005538960 nova_compute[187252]: 2025-11-28 16:25:20.678 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:25:21 np0005538960 nova_compute[187252]: 2025-11-28 16:25:21.704 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347106.7030973, 8dd47dea-36db-454f-9c3e-db6c599d52c0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:25:21 np0005538960 nova_compute[187252]: 2025-11-28 16:25:21.704 187256 INFO nova.compute.manager [-] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:25:21 np0005538960 nova_compute[187252]: 2025-11-28 16:25:21.729 187256 DEBUG nova.compute.manager [None req-06555b1a-8317-49b0-bf00-cad238dc9a89 - - - - - -] [instance: 8dd47dea-36db-454f-9c3e-db6c599d52c0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:25:22 np0005538960 nova_compute[187252]: 2025-11-28 16:25:22.324 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:25:22 np0005538960 nova_compute[187252]: 2025-11-28 16:25:22.864 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:23 np0005538960 nova_compute[187252]: 2025-11-28 16:25:23.310 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:25:23 np0005538960 nova_compute[187252]: 2025-11-28 16:25:23.515 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:24 np0005538960 podman[218161]: 2025-11-28 16:25:24.154649416 +0000 UTC m=+0.062547606 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 11:25:25 np0005538960 nova_compute[187252]: 2025-11-28 16:25:25.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:25:27 np0005538960 podman[218182]: 2025-11-28 16:25:27.189529746 +0000 UTC m=+0.085591587 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:25:27 np0005538960 nova_compute[187252]: 2025-11-28 16:25:27.866 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:28 np0005538960 nova_compute[187252]: 2025-11-28 16:25:28.482 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347113.481391, cd030167-3f92-4d3e-9ecc-3be8d39ada4f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:25:28 np0005538960 nova_compute[187252]: 2025-11-28 16:25:28.482 187256 INFO nova.compute.manager [-] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:25:28 np0005538960 nova_compute[187252]: 2025-11-28 16:25:28.515 187256 DEBUG nova.compute.manager [None req-9bd3b13f-7a12-42f3-8ba7-8545d1187ad8 - - - - - -] [instance: cd030167-3f92-4d3e-9ecc-3be8d39ada4f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:25:28 np0005538960 nova_compute[187252]: 2025-11-28 16:25:28.517 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:32 np0005538960 nova_compute[187252]: 2025-11-28 16:25:32.868 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:33 np0005538960 nova_compute[187252]: 2025-11-28 16:25:33.519 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:34 np0005538960 podman[218206]: 2025-11-28 16:25:34.202875752 +0000 UTC m=+0.098006039 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 11:25:37 np0005538960 podman[218230]: 2025-11-28 16:25:37.16298736 +0000 UTC m=+0.059803708 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:25:37 np0005538960 podman[218229]: 2025-11-28 16:25:37.16376455 +0000 UTC m=+0.065329244 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 28 11:25:37 np0005538960 nova_compute[187252]: 2025-11-28 16:25:37.870 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:38 np0005538960 nova_compute[187252]: 2025-11-28 16:25:38.521 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:42 np0005538960 podman[218268]: 2025-11-28 16:25:42.855242572 +0000 UTC m=+0.063595990 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:25:42 np0005538960 nova_compute[187252]: 2025-11-28 16:25:42.872 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:43 np0005538960 nova_compute[187252]: 2025-11-28 16:25:43.526 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.280 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.281 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.318 187256 DEBUG nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.436 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.437 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.444 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.445 187256 INFO nova.compute.claims [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.577 187256 DEBUG nova.compute.provider_tree [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.598 187256 DEBUG nova.scheduler.client.report [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.629 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.192s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.630 187256 DEBUG nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.683 187256 DEBUG nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.684 187256 DEBUG nova.network.neutron [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.704 187256 INFO nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.722 187256 DEBUG nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.810 187256 DEBUG nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.811 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.811 187256 INFO nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Creating image(s)#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.812 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.812 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.813 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.825 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.893 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.895 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.896 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.908 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.968 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:25:44 np0005538960 nova_compute[187252]: 2025-11-28 16:25:44.969 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.013 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.014 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.015 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.080 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.082 187256 DEBUG nova.virt.disk.api [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Checking if we can resize image /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.082 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.145 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.147 187256 DEBUG nova.virt.disk.api [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Cannot resize image /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.147 187256 DEBUG nova.objects.instance [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'migration_context' on Instance uuid ecbea330-ccac-4a01-a80b-0c10a2f686e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.165 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.166 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Ensure instance console log exists: /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.167 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.167 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.168 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:45 np0005538960 nova_compute[187252]: 2025-11-28 16:25:45.206 187256 DEBUG nova.policy [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:25:46 np0005538960 podman[218307]: 2025-11-28 16:25:46.156982306 +0000 UTC m=+0.058921627 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 11:25:46 np0005538960 nova_compute[187252]: 2025-11-28 16:25:46.924 187256 DEBUG nova.network.neutron [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Successfully created port: ccc62a21-60d5-4151-8ab5-c33149100cd0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:25:47 np0005538960 nova_compute[187252]: 2025-11-28 16:25:47.880 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:48 np0005538960 nova_compute[187252]: 2025-11-28 16:25:48.529 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:49 np0005538960 nova_compute[187252]: 2025-11-28 16:25:49.083 187256 DEBUG nova.network.neutron [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Successfully updated port: ccc62a21-60d5-4151-8ab5-c33149100cd0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:25:49 np0005538960 nova_compute[187252]: 2025-11-28 16:25:49.120 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:25:49 np0005538960 nova_compute[187252]: 2025-11-28 16:25:49.120 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:25:49 np0005538960 nova_compute[187252]: 2025-11-28 16:25:49.121 187256 DEBUG nova.network.neutron [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:25:49 np0005538960 nova_compute[187252]: 2025-11-28 16:25:49.195 187256 DEBUG nova.compute.manager [req-2905125d-2f46-4f24-9473-6ceb4ce776f7 req-9d2613ff-49d3-4c72-a3cf-b945128dd04c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-changed-ccc62a21-60d5-4151-8ab5-c33149100cd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:25:49 np0005538960 nova_compute[187252]: 2025-11-28 16:25:49.196 187256 DEBUG nova.compute.manager [req-2905125d-2f46-4f24-9473-6ceb4ce776f7 req-9d2613ff-49d3-4c72-a3cf-b945128dd04c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing instance network info cache due to event network-changed-ccc62a21-60d5-4151-8ab5-c33149100cd0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:25:49 np0005538960 nova_compute[187252]: 2025-11-28 16:25:49.196 187256 DEBUG oslo_concurrency.lockutils [req-2905125d-2f46-4f24-9473-6ceb4ce776f7 req-9d2613ff-49d3-4c72-a3cf-b945128dd04c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:25:49 np0005538960 nova_compute[187252]: 2025-11-28 16:25:49.324 187256 DEBUG nova.network.neutron [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.287 187256 DEBUG nova.network.neutron [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.304 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.305 187256 DEBUG nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Instance network_info: |[{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.305 187256 DEBUG oslo_concurrency.lockutils [req-2905125d-2f46-4f24-9473-6ceb4ce776f7 req-9d2613ff-49d3-4c72-a3cf-b945128dd04c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.305 187256 DEBUG nova.network.neutron [req-2905125d-2f46-4f24-9473-6ceb4ce776f7 req-9d2613ff-49d3-4c72-a3cf-b945128dd04c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing network info cache for port ccc62a21-60d5-4151-8ab5-c33149100cd0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.308 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Start _get_guest_xml network_info=[{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.313 187256 WARNING nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.316 187256 DEBUG nova.virt.libvirt.host [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.317 187256 DEBUG nova.virt.libvirt.host [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.322 187256 DEBUG nova.virt.libvirt.host [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.322 187256 DEBUG nova.virt.libvirt.host [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.323 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.323 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.324 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.324 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.324 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.324 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.325 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.325 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.325 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.325 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.325 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.326 187256 DEBUG nova.virt.hardware [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.329 187256 DEBUG nova.virt.libvirt.vif [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1122100469',display_name='tempest-TestNetworkBasicOps-server-1122100469',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1122100469',id=26,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIlPWyvkROh7MBxD8hU490Sb9a2OSX3b0N0u427OSfQDYupob+Q87e0mTGtZ03o9uU2OcrEOzyX3GaZpnMoGT/Lwyo3imGuadY4jiKIo2URn+d5N+y/vPBH3pm/LOkXk6Q==',key_name='tempest-TestNetworkBasicOps-1436030712',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-i3jyimxo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:25:44Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=ecbea330-ccac-4a01-a80b-0c10a2f686e2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.329 187256 DEBUG nova.network.os_vif_util [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.329 187256 DEBUG nova.network.os_vif_util [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:d6:a7,bridge_name='br-int',has_traffic_filtering=True,id=ccc62a21-60d5-4151-8ab5-c33149100cd0,network=Network(577c0581-66ed-41fb-8a29-0e25a0007ac2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccc62a21-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.330 187256 DEBUG nova.objects.instance [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'pci_devices' on Instance uuid ecbea330-ccac-4a01-a80b-0c10a2f686e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.350 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <uuid>ecbea330-ccac-4a01-a80b-0c10a2f686e2</uuid>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <name>instance-0000001a</name>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkBasicOps-server-1122100469</nova:name>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:25:50</nova:creationTime>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:        <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:        <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:        <nova:port uuid="ccc62a21-60d5-4151-8ab5-c33149100cd0">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <entry name="serial">ecbea330-ccac-4a01-a80b-0c10a2f686e2</entry>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <entry name="uuid">ecbea330-ccac-4a01-a80b-0c10a2f686e2</entry>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.config"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:a3:d6:a7"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <target dev="tapccc62a21-60"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/console.log" append="off"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:25:50 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:25:50 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:25:50 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:25:50 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.351 187256 DEBUG nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Preparing to wait for external event network-vif-plugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.351 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.352 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.352 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.352 187256 DEBUG nova.virt.libvirt.vif [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1122100469',display_name='tempest-TestNetworkBasicOps-server-1122100469',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1122100469',id=26,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIlPWyvkROh7MBxD8hU490Sb9a2OSX3b0N0u427OSfQDYupob+Q87e0mTGtZ03o9uU2OcrEOzyX3GaZpnMoGT/Lwyo3imGuadY4jiKIo2URn+d5N+y/vPBH3pm/LOkXk6Q==',key_name='tempest-TestNetworkBasicOps-1436030712',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-i3jyimxo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:25:44Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=ecbea330-ccac-4a01-a80b-0c10a2f686e2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.353 187256 DEBUG nova.network.os_vif_util [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.353 187256 DEBUG nova.network.os_vif_util [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:d6:a7,bridge_name='br-int',has_traffic_filtering=True,id=ccc62a21-60d5-4151-8ab5-c33149100cd0,network=Network(577c0581-66ed-41fb-8a29-0e25a0007ac2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccc62a21-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.353 187256 DEBUG os_vif [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:d6:a7,bridge_name='br-int',has_traffic_filtering=True,id=ccc62a21-60d5-4151-8ab5-c33149100cd0,network=Network(577c0581-66ed-41fb-8a29-0e25a0007ac2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccc62a21-60') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.354 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.354 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.355 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.357 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.358 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapccc62a21-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.358 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapccc62a21-60, col_values=(('external_ids', {'iface-id': 'ccc62a21-60d5-4151-8ab5-c33149100cd0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:d6:a7', 'vm-uuid': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.359 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:50 np0005538960 NetworkManager[55548]: <info>  [1764347150.3608] manager: (tapccc62a21-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.362 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.367 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.368 187256 INFO os_vif [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:d6:a7,bridge_name='br-int',has_traffic_filtering=True,id=ccc62a21-60d5-4151-8ab5-c33149100cd0,network=Network(577c0581-66ed-41fb-8a29-0e25a0007ac2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccc62a21-60')#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.408 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.408 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.409 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No VIF found with MAC fa:16:3e:a3:d6:a7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.409 187256 INFO nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Using config drive#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.849 187256 INFO nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Creating config drive at /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.config#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.854 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8oxx1xiu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:25:50 np0005538960 nova_compute[187252]: 2025-11-28 16:25:50.981 187256 DEBUG oslo_concurrency.processutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8oxx1xiu" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:25:51 np0005538960 kernel: tapccc62a21-60: entered promiscuous mode
Nov 28 11:25:51 np0005538960 NetworkManager[55548]: <info>  [1764347151.0532] manager: (tapccc62a21-60): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Nov 28 11:25:51 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:51Z|00111|binding|INFO|Claiming lport ccc62a21-60d5-4151-8ab5-c33149100cd0 for this chassis.
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.053 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:51 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:51Z|00112|binding|INFO|ccc62a21-60d5-4151-8ab5-c33149100cd0: Claiming fa:16:3e:a3:d6:a7 10.100.0.14
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.057 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.067 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:51 np0005538960 systemd-udevd[218347]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:25:51 np0005538960 systemd-machined[153518]: New machine qemu-9-instance-0000001a.
Nov 28 11:25:51 np0005538960 NetworkManager[55548]: <info>  [1764347151.1043] device (tapccc62a21-60): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:25:51 np0005538960 NetworkManager[55548]: <info>  [1764347151.1058] device (tapccc62a21-60): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.121 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:51 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:51Z|00113|binding|INFO|Setting lport ccc62a21-60d5-4151-8ab5-c33149100cd0 ovn-installed in OVS
Nov 28 11:25:51 np0005538960 systemd[1]: Started Virtual Machine qemu-9-instance-0000001a.
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.126 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:51 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:51Z|00114|binding|INFO|Setting lport ccc62a21-60d5-4151-8ab5-c33149100cd0 up in Southbound
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.185 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:d6:a7 10.100.0.14'], port_security=['fa:16:3e:a3:d6:a7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-577c0581-66ed-41fb-8a29-0e25a0007ac2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'da1cc23b-094e-481a-bce5-3cc0ef981d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5b2475e-34ed-477b-9001-b455c0e4d7e2, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=ccc62a21-60d5-4151-8ab5-c33149100cd0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.187 104369 INFO neutron.agent.ovn.metadata.agent [-] Port ccc62a21-60d5-4151-8ab5-c33149100cd0 in datapath 577c0581-66ed-41fb-8a29-0e25a0007ac2 bound to our chassis#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.188 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 577c0581-66ed-41fb-8a29-0e25a0007ac2#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.203 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e02d3439-175a-4360-a11a-fff4e0aadf75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.205 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap577c0581-61 in ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.208 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap577c0581-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.208 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[42dc76ef-f457-4aba-acb3-4d63b4832956]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.210 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[8787bd04-159e-4043-b3ea-a2db6949cbf9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.222 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[3522043d-5b94-435e-9fb1-625f65f69477]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.250 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[41e5ce80-4ddf-41e0-96c9-f11e8b87cd3b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.284 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[d7ba8666-c391-4b17-a3f0-fb2a06b34087]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 systemd-udevd[218350]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:25:51 np0005538960 NetworkManager[55548]: <info>  [1764347151.2938] manager: (tap577c0581-60): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.291 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d9587912-4431-4362-9408-05302c15dbe0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.336 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[3532c641-b8bc-4009-91b4-00abac2a1e98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.340 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[34764ee5-9f8f-4f93-b716-5cf6444b0c59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 NetworkManager[55548]: <info>  [1764347151.3709] device (tap577c0581-60): carrier: link connected
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.374 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[b30e31fe-310e-4dd3-b4a2-9d10ea508a83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.395 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2ef00c66-470b-46c9-bd0a-9a4fc1fb025b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap577c0581-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:36:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425094, 'reachable_time': 37614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218381, 'error': None, 'target': 'ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.416 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b988d093-b612-43bf-8e52-d0f8e2c119a5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe18:36be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 425094, 'tstamp': 425094}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218384, 'error': None, 'target': 'ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.435 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d939e5aa-167f-4bfc-ac92-7a3871baaf61]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap577c0581-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:36:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 35], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425094, 'reachable_time': 37614, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218389, 'error': None, 'target': 'ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.467 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c0088739-d9aa-48c5-8a59-0ff5af1e7e2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.500 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347151.5000176, ecbea330-ccac-4a01-a80b-0c10a2f686e2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.501 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] VM Started (Lifecycle Event)#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.524 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.524 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[894ed6eb-ebc5-4a31-a634-d3fde6980bdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.525 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap577c0581-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.526 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.526 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap577c0581-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:51 np0005538960 NetworkManager[55548]: <info>  [1764347151.5299] manager: (tap577c0581-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Nov 28 11:25:51 np0005538960 kernel: tap577c0581-60: entered promiscuous mode
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.532 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.533 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap577c0581-60, col_values=(('external_ids', {'iface-id': '065b851d-69a4-49d0-a066-f5c141f99961'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:25:51 np0005538960 ovn_controller[95460]: 2025-11-28T16:25:51Z|00115|binding|INFO|Releasing lport 065b851d-69a4-49d0-a066-f5c141f99961 from this chassis (sb_readonly=0)
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.536 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.537 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/577c0581-66ed-41fb-8a29-0e25a0007ac2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/577c0581-66ed-41fb-8a29-0e25a0007ac2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.537 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347151.5010233, ecbea330-ccac-4a01-a80b-0c10a2f686e2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.538 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.538 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[fea7066e-ed72-437a-8725-b9cfb7095657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.539 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-577c0581-66ed-41fb-8a29-0e25a0007ac2
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/577c0581-66ed-41fb-8a29-0e25a0007ac2.pid.haproxy
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID 577c0581-66ed-41fb-8a29-0e25a0007ac2
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:25:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:25:51.540 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2', 'env', 'PROCESS_TAG=haproxy-577c0581-66ed-41fb-8a29-0e25a0007ac2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/577c0581-66ed-41fb-8a29-0e25a0007ac2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.550 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.553 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.556 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.573 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.736 187256 DEBUG nova.network.neutron [req-2905125d-2f46-4f24-9473-6ceb4ce776f7 req-9d2613ff-49d3-4c72-a3cf-b945128dd04c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updated VIF entry in instance network info cache for port ccc62a21-60d5-4151-8ab5-c33149100cd0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.736 187256 DEBUG nova.network.neutron [req-2905125d-2f46-4f24-9473-6ceb4ce776f7 req-9d2613ff-49d3-4c72-a3cf-b945128dd04c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:25:51 np0005538960 nova_compute[187252]: 2025-11-28 16:25:51.750 187256 DEBUG oslo_concurrency.lockutils [req-2905125d-2f46-4f24-9473-6ceb4ce776f7 req-9d2613ff-49d3-4c72-a3cf-b945128dd04c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:25:51 np0005538960 podman[218422]: 2025-11-28 16:25:51.951212251 +0000 UTC m=+0.058211819 container create fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 11:25:51 np0005538960 systemd[1]: Started libpod-conmon-fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e.scope.
Nov 28 11:25:52 np0005538960 podman[218422]: 2025-11-28 16:25:51.917175422 +0000 UTC m=+0.024175010 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:25:52 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:25:52 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b66ce6399351065bf1c77a1efda1b1cff97cc642369591071f80a7143088280e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:25:52 np0005538960 podman[218422]: 2025-11-28 16:25:52.042980418 +0000 UTC m=+0.149980016 container init fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:25:52 np0005538960 podman[218422]: 2025-11-28 16:25:52.050423099 +0000 UTC m=+0.157422667 container start fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:25:52 np0005538960 neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2[218439]: [NOTICE]   (218443) : New worker (218445) forked
Nov 28 11:25:52 np0005538960 neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2[218439]: [NOTICE]   (218443) : Loading success.
Nov 28 11:25:52 np0005538960 nova_compute[187252]: 2025-11-28 16:25:52.883 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.024 187256 DEBUG nova.compute.manager [req-c382feae-aeb1-4aff-8ddd-ad0e371eb252 req-4e7ec790-1a1e-4b03-af21-835e620cd911 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-vif-plugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.024 187256 DEBUG oslo_concurrency.lockutils [req-c382feae-aeb1-4aff-8ddd-ad0e371eb252 req-4e7ec790-1a1e-4b03-af21-835e620cd911 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.025 187256 DEBUG oslo_concurrency.lockutils [req-c382feae-aeb1-4aff-8ddd-ad0e371eb252 req-4e7ec790-1a1e-4b03-af21-835e620cd911 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.025 187256 DEBUG oslo_concurrency.lockutils [req-c382feae-aeb1-4aff-8ddd-ad0e371eb252 req-4e7ec790-1a1e-4b03-af21-835e620cd911 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.025 187256 DEBUG nova.compute.manager [req-c382feae-aeb1-4aff-8ddd-ad0e371eb252 req-4e7ec790-1a1e-4b03-af21-835e620cd911 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Processing event network-vif-plugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.026 187256 DEBUG nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.032 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347155.0318341, ecbea330-ccac-4a01-a80b-0c10a2f686e2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.032 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.036 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.043 187256 INFO nova.virt.libvirt.driver [-] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Instance spawned successfully.#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.044 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.051 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.056 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.065 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.066 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.072 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.072 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.073 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.073 187256 DEBUG nova.virt.libvirt.driver [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.079 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.133 187256 INFO nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Took 10.32 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.134 187256 DEBUG nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:25:55 np0005538960 podman[218456]: 2025-11-28 16:25:55.189186551 +0000 UTC m=+0.084705255 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.191 187256 INFO nova.compute.manager [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Took 10.79 seconds to build instance.#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.211 187256 DEBUG oslo_concurrency.lockutils [None req-7ca0ccd7-60f3-417b-877f-76a63926a005 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:55 np0005538960 nova_compute[187252]: 2025-11-28 16:25:55.361 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:57 np0005538960 nova_compute[187252]: 2025-11-28 16:25:57.885 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:25:57 np0005538960 nova_compute[187252]: 2025-11-28 16:25:57.974 187256 DEBUG nova.compute.manager [req-441699fb-7d46-45a2-8b09-18c68928d4f6 req-09fa18f2-db1c-4439-8579-8066acabfbbd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-vif-plugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:25:57 np0005538960 nova_compute[187252]: 2025-11-28 16:25:57.975 187256 DEBUG oslo_concurrency.lockutils [req-441699fb-7d46-45a2-8b09-18c68928d4f6 req-09fa18f2-db1c-4439-8579-8066acabfbbd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:25:57 np0005538960 nova_compute[187252]: 2025-11-28 16:25:57.975 187256 DEBUG oslo_concurrency.lockutils [req-441699fb-7d46-45a2-8b09-18c68928d4f6 req-09fa18f2-db1c-4439-8579-8066acabfbbd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:25:57 np0005538960 nova_compute[187252]: 2025-11-28 16:25:57.975 187256 DEBUG oslo_concurrency.lockutils [req-441699fb-7d46-45a2-8b09-18c68928d4f6 req-09fa18f2-db1c-4439-8579-8066acabfbbd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:25:57 np0005538960 nova_compute[187252]: 2025-11-28 16:25:57.975 187256 DEBUG nova.compute.manager [req-441699fb-7d46-45a2-8b09-18c68928d4f6 req-09fa18f2-db1c-4439-8579-8066acabfbbd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] No waiting events found dispatching network-vif-plugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:25:57 np0005538960 nova_compute[187252]: 2025-11-28 16:25:57.975 187256 WARNING nova.compute.manager [req-441699fb-7d46-45a2-8b09-18c68928d4f6 req-09fa18f2-db1c-4439-8579-8066acabfbbd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received unexpected event network-vif-plugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:25:58 np0005538960 podman[218473]: 2025-11-28 16:25:58.162118741 +0000 UTC m=+0.059701836 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:26:00 np0005538960 nova_compute[187252]: 2025-11-28 16:26:00.365 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:00 np0005538960 nova_compute[187252]: 2025-11-28 16:26:00.467 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:00 np0005538960 NetworkManager[55548]: <info>  [1764347160.4683] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Nov 28 11:26:00 np0005538960 NetworkManager[55548]: <info>  [1764347160.4691] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Nov 28 11:26:00 np0005538960 nova_compute[187252]: 2025-11-28 16:26:00.630 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:00 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:00Z|00116|binding|INFO|Releasing lport 065b851d-69a4-49d0-a066-f5c141f99961 from this chassis (sb_readonly=0)
Nov 28 11:26:00 np0005538960 nova_compute[187252]: 2025-11-28 16:26:00.656 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:02 np0005538960 nova_compute[187252]: 2025-11-28 16:26:02.667 187256 DEBUG nova.compute.manager [req-e4654a08-deb9-4e32-8249-b544c1427a39 req-23867849-c931-4922-abdf-049aa616e654 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-changed-ccc62a21-60d5-4151-8ab5-c33149100cd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:26:02 np0005538960 nova_compute[187252]: 2025-11-28 16:26:02.669 187256 DEBUG nova.compute.manager [req-e4654a08-deb9-4e32-8249-b544c1427a39 req-23867849-c931-4922-abdf-049aa616e654 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing instance network info cache due to event network-changed-ccc62a21-60d5-4151-8ab5-c33149100cd0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:26:02 np0005538960 nova_compute[187252]: 2025-11-28 16:26:02.670 187256 DEBUG oslo_concurrency.lockutils [req-e4654a08-deb9-4e32-8249-b544c1427a39 req-23867849-c931-4922-abdf-049aa616e654 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:26:02 np0005538960 nova_compute[187252]: 2025-11-28 16:26:02.670 187256 DEBUG oslo_concurrency.lockutils [req-e4654a08-deb9-4e32-8249-b544c1427a39 req-23867849-c931-4922-abdf-049aa616e654 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:26:02 np0005538960 nova_compute[187252]: 2025-11-28 16:26:02.670 187256 DEBUG nova.network.neutron [req-e4654a08-deb9-4e32-8249-b544c1427a39 req-23867849-c931-4922-abdf-049aa616e654 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing network info cache for port ccc62a21-60d5-4151-8ab5-c33149100cd0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:26:02 np0005538960 nova_compute[187252]: 2025-11-28 16:26:02.888 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:04 np0005538960 nova_compute[187252]: 2025-11-28 16:26:04.652 187256 DEBUG nova.network.neutron [req-e4654a08-deb9-4e32-8249-b544c1427a39 req-23867849-c931-4922-abdf-049aa616e654 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updated VIF entry in instance network info cache for port ccc62a21-60d5-4151-8ab5-c33149100cd0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:26:04 np0005538960 nova_compute[187252]: 2025-11-28 16:26:04.653 187256 DEBUG nova.network.neutron [req-e4654a08-deb9-4e32-8249-b544c1427a39 req-23867849-c931-4922-abdf-049aa616e654 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:26:04 np0005538960 nova_compute[187252]: 2025-11-28 16:26:04.692 187256 DEBUG oslo_concurrency.lockutils [req-e4654a08-deb9-4e32-8249-b544c1427a39 req-23867849-c931-4922-abdf-049aa616e654 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:26:05 np0005538960 podman[218499]: 2025-11-28 16:26:05.184419865 +0000 UTC m=+0.085348161 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 11:26:05 np0005538960 nova_compute[187252]: 2025-11-28 16:26:05.367 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:05 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:05Z|00117|binding|INFO|Releasing lport 065b851d-69a4-49d0-a066-f5c141f99961 from this chassis (sb_readonly=0)
Nov 28 11:26:05 np0005538960 nova_compute[187252]: 2025-11-28 16:26:05.479 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:05.928 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:26:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:05.930 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:26:05 np0005538960 nova_compute[187252]: 2025-11-28 16:26:05.931 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:06.346 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:06.348 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:06.349 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:07 np0005538960 nova_compute[187252]: 2025-11-28 16:26:07.891 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:08 np0005538960 podman[218549]: 2025-11-28 16:26:08.155820309 +0000 UTC m=+0.057166165 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Nov 28 11:26:08 np0005538960 podman[218548]: 2025-11-28 16:26:08.163131726 +0000 UTC m=+0.066843680 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:26:08 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:08Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a3:d6:a7 10.100.0.14
Nov 28 11:26:08 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:08Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a3:d6:a7 10.100.0.14
Nov 28 11:26:10 np0005538960 nova_compute[187252]: 2025-11-28 16:26:10.371 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:12 np0005538960 nova_compute[187252]: 2025-11-28 16:26:12.894 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:13 np0005538960 podman[218588]: 2025-11-28 16:26:13.17842447 +0000 UTC m=+0.081668861 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:26:13 np0005538960 nova_compute[187252]: 2025-11-28 16:26:13.359 187256 INFO nova.compute.manager [None req-7aee4262-38ca-4049-99de-1b513d99e948 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Get console output#033[00m
Nov 28 11:26:13 np0005538960 nova_compute[187252]: 2025-11-28 16:26:13.365 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:26:15 np0005538960 nova_compute[187252]: 2025-11-28 16:26:15.375 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:15 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:15.932 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:16 np0005538960 nova_compute[187252]: 2025-11-28 16:26:16.259 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:16 np0005538960 nova_compute[187252]: 2025-11-28 16:26:16.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:26:17 np0005538960 podman[218614]: 2025-11-28 16:26:17.160325177 +0000 UTC m=+0.058999778 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 28 11:26:17 np0005538960 nova_compute[187252]: 2025-11-28 16:26:17.326 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:26:17 np0005538960 nova_compute[187252]: 2025-11-28 16:26:17.326 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 11:26:17 np0005538960 nova_compute[187252]: 2025-11-28 16:26:17.896 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:18 np0005538960 nova_compute[187252]: 2025-11-28 16:26:18.341 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:26:18 np0005538960 nova_compute[187252]: 2025-11-28 16:26:18.342 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:26:18 np0005538960 nova_compute[187252]: 2025-11-28 16:26:18.485 187256 DEBUG oslo_concurrency.lockutils [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "interface-ecbea330-ccac-4a01-a80b-0c10a2f686e2-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:18 np0005538960 nova_compute[187252]: 2025-11-28 16:26:18.486 187256 DEBUG oslo_concurrency.lockutils [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "interface-ecbea330-ccac-4a01-a80b-0c10a2f686e2-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:18 np0005538960 nova_compute[187252]: 2025-11-28 16:26:18.487 187256 DEBUG nova.objects.instance [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'flavor' on Instance uuid ecbea330-ccac-4a01-a80b-0c10a2f686e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.059 187256 DEBUG nova.objects.instance [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'pci_requests' on Instance uuid ecbea330-ccac-4a01-a80b-0c10a2f686e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.074 187256 DEBUG nova.network.neutron [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.342 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.342 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.343 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.343 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.440 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.507 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.508 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.574 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.725 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.726 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5578MB free_disk=73.31366729736328GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.727 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.727 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.800 187256 DEBUG nova.policy [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.928 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance ecbea330-ccac-4a01-a80b-0c10a2f686e2 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.928 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:26:19 np0005538960 nova_compute[187252]: 2025-11-28 16:26:19.929 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:26:20 np0005538960 nova_compute[187252]: 2025-11-28 16:26:20.087 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:26:20 np0005538960 nova_compute[187252]: 2025-11-28 16:26:20.102 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:26:20 np0005538960 nova_compute[187252]: 2025-11-28 16:26:20.140 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:26:20 np0005538960 nova_compute[187252]: 2025-11-28 16:26:20.141 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.414s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:20 np0005538960 nova_compute[187252]: 2025-11-28 16:26:20.378 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:22 np0005538960 nova_compute[187252]: 2025-11-28 16:26:22.142 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:26:22 np0005538960 nova_compute[187252]: 2025-11-28 16:26:22.143 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:26:22 np0005538960 nova_compute[187252]: 2025-11-28 16:26:22.143 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:26:22 np0005538960 nova_compute[187252]: 2025-11-28 16:26:22.897 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:23 np0005538960 nova_compute[187252]: 2025-11-28 16:26:23.740 187256 INFO nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating ports in neutron#033[00m
Nov 28 11:26:24 np0005538960 nova_compute[187252]: 2025-11-28 16:26:24.742 187256 INFO nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating port c5a5eead-793d-43f8-8cc3-a792eb3d80f8 with attributes {'binding:host_id': 'compute-1.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Nov 28 11:26:24 np0005538960 nova_compute[187252]: 2025-11-28 16:26:24.877 187256 DEBUG nova.network.neutron [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Successfully created port: c5a5eead-793d-43f8-8cc3-a792eb3d80f8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:26:25 np0005538960 nova_compute[187252]: 2025-11-28 16:26:25.382 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:26 np0005538960 podman[218641]: 2025-11-28 16:26:26.167099284 +0000 UTC m=+0.071247328 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 11:26:26 np0005538960 nova_compute[187252]: 2025-11-28 16:26:26.612 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:26 np0005538960 nova_compute[187252]: 2025-11-28 16:26:26.851 187256 DEBUG nova.network.neutron [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Successfully updated port: c5a5eead-793d-43f8-8cc3-a792eb3d80f8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:26:26 np0005538960 nova_compute[187252]: 2025-11-28 16:26:26.906 187256 DEBUG oslo_concurrency.lockutils [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:26:26 np0005538960 nova_compute[187252]: 2025-11-28 16:26:26.906 187256 DEBUG oslo_concurrency.lockutils [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:26:26 np0005538960 nova_compute[187252]: 2025-11-28 16:26:26.907 187256 DEBUG nova.network.neutron [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:26:27 np0005538960 nova_compute[187252]: 2025-11-28 16:26:27.180 187256 DEBUG nova.compute.manager [req-ff9cc577-7d5b-48e2-9876-29a5b1c40f16 req-47a19da6-b5d9-4aee-b73e-f7bdc83f6cca 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-changed-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:26:27 np0005538960 nova_compute[187252]: 2025-11-28 16:26:27.180 187256 DEBUG nova.compute.manager [req-ff9cc577-7d5b-48e2-9876-29a5b1c40f16 req-47a19da6-b5d9-4aee-b73e-f7bdc83f6cca 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing instance network info cache due to event network-changed-c5a5eead-793d-43f8-8cc3-a792eb3d80f8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:26:27 np0005538960 nova_compute[187252]: 2025-11-28 16:26:27.180 187256 DEBUG oslo_concurrency.lockutils [req-ff9cc577-7d5b-48e2-9876-29a5b1c40f16 req-47a19da6-b5d9-4aee-b73e-f7bdc83f6cca 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:26:27 np0005538960 nova_compute[187252]: 2025-11-28 16:26:27.746 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:26:27 np0005538960 nova_compute[187252]: 2025-11-28 16:26:27.899 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:29 np0005538960 podman[218661]: 2025-11-28 16:26:29.146083912 +0000 UTC m=+0.054355365 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:26:29 np0005538960 nova_compute[187252]: 2025-11-28 16:26:29.385 187256 DEBUG nova.compute.manager [req-55ea5351-315b-4756-a1d1-fe662e711b7a req-6df72292-aadf-48ef-908b-9a2e52fb140f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-changed-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:26:29 np0005538960 nova_compute[187252]: 2025-11-28 16:26:29.385 187256 DEBUG nova.compute.manager [req-55ea5351-315b-4756-a1d1-fe662e711b7a req-6df72292-aadf-48ef-908b-9a2e52fb140f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing instance network info cache due to event network-changed-c5a5eead-793d-43f8-8cc3-a792eb3d80f8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:26:29 np0005538960 nova_compute[187252]: 2025-11-28 16:26:29.386 187256 DEBUG oslo_concurrency.lockutils [req-55ea5351-315b-4756-a1d1-fe662e711b7a req-6df72292-aadf-48ef-908b-9a2e52fb140f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.385 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.857 187256 DEBUG nova.network.neutron [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.877 187256 DEBUG oslo_concurrency.lockutils [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.878 187256 DEBUG oslo_concurrency.lockutils [req-ff9cc577-7d5b-48e2-9876-29a5b1c40f16 req-47a19da6-b5d9-4aee-b73e-f7bdc83f6cca 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.879 187256 DEBUG nova.network.neutron [req-ff9cc577-7d5b-48e2-9876-29a5b1c40f16 req-47a19da6-b5d9-4aee-b73e-f7bdc83f6cca 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing network info cache for port c5a5eead-793d-43f8-8cc3-a792eb3d80f8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.882 187256 DEBUG nova.virt.libvirt.vif [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1122100469',display_name='tempest-TestNetworkBasicOps-server-1122100469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1122100469',id=26,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIlPWyvkROh7MBxD8hU490Sb9a2OSX3b0N0u427OSfQDYupob+Q87e0mTGtZ03o9uU2OcrEOzyX3GaZpnMoGT/Lwyo3imGuadY4jiKIo2URn+d5N+y/vPBH3pm/LOkXk6Q==',key_name='tempest-TestNetworkBasicOps-1436030712',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-i3jyimxo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:25:55Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=ecbea330-ccac-4a01-a80b-0c10a2f686e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.883 187256 DEBUG nova.network.os_vif_util [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.884 187256 DEBUG nova.network.os_vif_util [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.884 187256 DEBUG os_vif [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.885 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.885 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.885 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.888 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.889 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5a5eead-79, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.889 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc5a5eead-79, col_values=(('external_ids', {'iface-id': 'c5a5eead-793d-43f8-8cc3-a792eb3d80f8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:6a:b5', 'vm-uuid': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:30 np0005538960 NetworkManager[55548]: <info>  [1764347190.8918] manager: (tapc5a5eead-79): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.893 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.896 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.898 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.899 187256 INFO os_vif [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79')#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.900 187256 DEBUG nova.virt.libvirt.vif [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1122100469',display_name='tempest-TestNetworkBasicOps-server-1122100469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1122100469',id=26,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIlPWyvkROh7MBxD8hU490Sb9a2OSX3b0N0u427OSfQDYupob+Q87e0mTGtZ03o9uU2OcrEOzyX3GaZpnMoGT/Lwyo3imGuadY4jiKIo2URn+d5N+y/vPBH3pm/LOkXk6Q==',key_name='tempest-TestNetworkBasicOps-1436030712',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-i3jyimxo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:25:55Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=ecbea330-ccac-4a01-a80b-0c10a2f686e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.901 187256 DEBUG nova.network.os_vif_util [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.901 187256 DEBUG nova.network.os_vif_util [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.904 187256 DEBUG nova.virt.libvirt.guest [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] attach device xml: <interface type="ethernet">
Nov 28 11:26:30 np0005538960 nova_compute[187252]:  <mac address="fa:16:3e:c0:6a:b5"/>
Nov 28 11:26:30 np0005538960 nova_compute[187252]:  <model type="virtio"/>
Nov 28 11:26:30 np0005538960 nova_compute[187252]:  <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:26:30 np0005538960 nova_compute[187252]:  <mtu size="1442"/>
Nov 28 11:26:30 np0005538960 nova_compute[187252]:  <target dev="tapc5a5eead-79"/>
Nov 28 11:26:30 np0005538960 nova_compute[187252]: </interface>
Nov 28 11:26:30 np0005538960 nova_compute[187252]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Nov 28 11:26:30 np0005538960 kernel: tapc5a5eead-79: entered promiscuous mode
Nov 28 11:26:30 np0005538960 NetworkManager[55548]: <info>  [1764347190.9155] manager: (tapc5a5eead-79): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.918 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:30 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:30Z|00118|binding|INFO|Claiming lport c5a5eead-793d-43f8-8cc3-a792eb3d80f8 for this chassis.
Nov 28 11:26:30 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:30Z|00119|binding|INFO|c5a5eead-793d-43f8-8cc3-a792eb3d80f8: Claiming fa:16:3e:c0:6a:b5 10.100.0.27
Nov 28 11:26:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:30.932 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:6a:b5 10.100.0.27'], port_security=['fa:16:3e:c0:6a:b5 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f663f661-7b7b-4edb-989d-ff8406790f59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '3', 'neutron:security_group_ids': '4551c4f3-bc67-4846-a687-f0bfbc9398f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a41596a-bfeb-4aa8-962b-05fdfeac8603, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=c5a5eead-793d-43f8-8cc3-a792eb3d80f8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:26:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:30.934 104369 INFO neutron.agent.ovn.metadata.agent [-] Port c5a5eead-793d-43f8-8cc3-a792eb3d80f8 in datapath f663f661-7b7b-4edb-989d-ff8406790f59 bound to our chassis#033[00m
Nov 28 11:26:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:30.936 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f663f661-7b7b-4edb-989d-ff8406790f59#033[00m
Nov 28 11:26:30 np0005538960 systemd-udevd[218694]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:26:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:30.949 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9a18178f-d252-4a16-a276-27e2cf56400e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:30.950 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf663f661-71 in ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:26:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:30.954 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf663f661-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:26:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:30.954 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[fafe5f75-c2b2-4cd6-b7a9-d56542eb7609]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:30.955 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e70afb2f-3f11-4c2b-9c55-246ec2bb91d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.956 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:30 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:30Z|00120|binding|INFO|Setting lport c5a5eead-793d-43f8-8cc3-a792eb3d80f8 ovn-installed in OVS
Nov 28 11:26:30 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:30Z|00121|binding|INFO|Setting lport c5a5eead-793d-43f8-8cc3-a792eb3d80f8 up in Southbound
Nov 28 11:26:30 np0005538960 nova_compute[187252]: 2025-11-28 16:26:30.960 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:30 np0005538960 NetworkManager[55548]: <info>  [1764347190.9669] device (tapc5a5eead-79): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:26:30 np0005538960 NetworkManager[55548]: <info>  [1764347190.9676] device (tapc5a5eead-79): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:26:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:30.969 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[415af852-d69a-4377-905d-cc5156c94b77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:30.986 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2be71364-d605-4b63-a99f-a64eb2007315]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.018 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[1487b884-7164-4cc2-ac92-0f5e48503da0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.024 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[68dc5840-6a09-46f9-946c-1e33c6b2b0d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 NetworkManager[55548]: <info>  [1764347191.0262] manager: (tapf663f661-70): new Veth device (/org/freedesktop/NetworkManager/Devices/71)
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.061 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[3d9b6fcc-774f-4a40-af81-7a4d94896fa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.066 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[cd72a97c-b364-4f83-bce0-42a0dd7d451b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 NetworkManager[55548]: <info>  [1764347191.0928] device (tapf663f661-70): carrier: link connected
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.099 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[86312f5d-401e-47e4-8247-d04dd9f1064f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.118 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d199fd-e896-43e5-b0a2-8ce4904c7364]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf663f661-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:43:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429067, 'reachable_time': 41663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218720, 'error': None, 'target': 'ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.137 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[64c7cbb6-2c19-4d6d-8b40-cee39fea21ad]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe73:432e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 429067, 'tstamp': 429067}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218721, 'error': None, 'target': 'ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.140 187256 DEBUG nova.virt.libvirt.driver [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.140 187256 DEBUG nova.virt.libvirt.driver [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.141 187256 DEBUG nova.virt.libvirt.driver [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No VIF found with MAC fa:16:3e:a3:d6:a7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.141 187256 DEBUG nova.virt.libvirt.driver [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No VIF found with MAC fa:16:3e:c0:6a:b5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.154 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d74cc3d7-0d0a-4e29-8cf9-34584dfbfcb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf663f661-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:73:43:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429067, 'reachable_time': 41663, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218722, 'error': None, 'target': 'ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.188 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a853a7a1-e68a-43e7-9b2a-aea78ff8223d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.189 187256 DEBUG nova.virt.libvirt.guest [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:26:31 np0005538960 nova_compute[187252]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:  <nova:name>tempest-TestNetworkBasicOps-server-1122100469</nova:name>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:  <nova:creationTime>2025-11-28 16:26:31</nova:creationTime>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:  <nova:flavor name="m1.nano">
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    <nova:memory>128</nova:memory>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    <nova:disk>1</nova:disk>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    <nova:swap>0</nova:swap>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    <nova:vcpus>1</nova:vcpus>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:  </nova:flavor>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:  <nova:owner>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:  </nova:owner>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:  <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:  <nova:ports>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    <nova:port uuid="ccc62a21-60d5-4151-8ab5-c33149100cd0">
Nov 28 11:26:31 np0005538960 nova_compute[187252]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    </nova:port>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    <nova:port uuid="c5a5eead-793d-43f8-8cc3-a792eb3d80f8">
Nov 28 11:26:31 np0005538960 nova_compute[187252]:      <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:    </nova:port>
Nov 28 11:26:31 np0005538960 nova_compute[187252]:  </nova:ports>
Nov 28 11:26:31 np0005538960 nova_compute[187252]: </nova:instance>
Nov 28 11:26:31 np0005538960 nova_compute[187252]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.237 187256 DEBUG oslo_concurrency.lockutils [None req-818b5553-1221-4d34-b228-22b27b35c35f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "interface-ecbea330-ccac-4a01-a80b-0c10a2f686e2-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 12.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.256 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ac30e7f4-eac6-42c8-8d9f-ac13158a2336]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.258 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf663f661-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.258 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.259 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf663f661-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.261 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:31 np0005538960 NetworkManager[55548]: <info>  [1764347191.2620] manager: (tapf663f661-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Nov 28 11:26:31 np0005538960 kernel: tapf663f661-70: entered promiscuous mode
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.265 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.268 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf663f661-70, col_values=(('external_ids', {'iface-id': '13f43c4e-df41-48fa-9d27-34cbd88f8c17'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:31 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:31Z|00122|binding|INFO|Releasing lport 13f43c4e-df41-48fa-9d27-34cbd88f8c17 from this chassis (sb_readonly=0)
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.270 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.271 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f663f661-7b7b-4edb-989d-ff8406790f59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f663f661-7b7b-4edb-989d-ff8406790f59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.272 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[39a183fb-8a83-446d-a505-7b59d519776e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.273 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-f663f661-7b7b-4edb-989d-ff8406790f59
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/f663f661-7b7b-4edb-989d-ff8406790f59.pid.haproxy
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID f663f661-7b7b-4edb-989d-ff8406790f59
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:26:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:31.274 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59', 'env', 'PROCESS_TAG=haproxy-f663f661-7b7b-4edb-989d-ff8406790f59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f663f661-7b7b-4edb-989d-ff8406790f59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.283 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.610 187256 DEBUG nova.compute.manager [req-f57e833b-a1ce-4141-91d7-e76e4318ccab req-828935e8-655c-4146-8d36-96f173c004cd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-vif-plugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.611 187256 DEBUG oslo_concurrency.lockutils [req-f57e833b-a1ce-4141-91d7-e76e4318ccab req-828935e8-655c-4146-8d36-96f173c004cd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.611 187256 DEBUG oslo_concurrency.lockutils [req-f57e833b-a1ce-4141-91d7-e76e4318ccab req-828935e8-655c-4146-8d36-96f173c004cd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.612 187256 DEBUG oslo_concurrency.lockutils [req-f57e833b-a1ce-4141-91d7-e76e4318ccab req-828935e8-655c-4146-8d36-96f173c004cd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.612 187256 DEBUG nova.compute.manager [req-f57e833b-a1ce-4141-91d7-e76e4318ccab req-828935e8-655c-4146-8d36-96f173c004cd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] No waiting events found dispatching network-vif-plugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:26:31 np0005538960 nova_compute[187252]: 2025-11-28 16:26:31.612 187256 WARNING nova.compute.manager [req-f57e833b-a1ce-4141-91d7-e76e4318ccab req-828935e8-655c-4146-8d36-96f173c004cd 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received unexpected event network-vif-plugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:26:31 np0005538960 podman[218754]: 2025-11-28 16:26:31.693483292 +0000 UTC m=+0.062377482 container create 244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 11:26:31 np0005538960 systemd[1]: Started libpod-conmon-244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad.scope.
Nov 28 11:26:31 np0005538960 podman[218754]: 2025-11-28 16:26:31.66097253 +0000 UTC m=+0.029866750 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:26:31 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:26:31 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60ba8d0cc65fb1b4bad4eb43253fa522e7f8d89472a8857fb8cd60de8023e023/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:26:31 np0005538960 podman[218754]: 2025-11-28 16:26:31.805671996 +0000 UTC m=+0.174566196 container init 244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 11:26:31 np0005538960 podman[218754]: 2025-11-28 16:26:31.812998334 +0000 UTC m=+0.181892524 container start 244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 11:26:31 np0005538960 neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59[218769]: [NOTICE]   (218773) : New worker (218775) forked
Nov 28 11:26:31 np0005538960 neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59[218769]: [NOTICE]   (218773) : Loading success.
Nov 28 11:26:32 np0005538960 nova_compute[187252]: 2025-11-28 16:26:32.846 187256 DEBUG nova.network.neutron [req-ff9cc577-7d5b-48e2-9876-29a5b1c40f16 req-47a19da6-b5d9-4aee-b73e-f7bdc83f6cca 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updated VIF entry in instance network info cache for port c5a5eead-793d-43f8-8cc3-a792eb3d80f8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:26:32 np0005538960 nova_compute[187252]: 2025-11-28 16:26:32.847 187256 DEBUG nova.network.neutron [req-ff9cc577-7d5b-48e2-9876-29a5b1c40f16 req-47a19da6-b5d9-4aee-b73e-f7bdc83f6cca 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:26:32 np0005538960 nova_compute[187252]: 2025-11-28 16:26:32.880 187256 DEBUG oslo_concurrency.lockutils [req-ff9cc577-7d5b-48e2-9876-29a5b1c40f16 req-47a19da6-b5d9-4aee-b73e-f7bdc83f6cca 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:26:32 np0005538960 nova_compute[187252]: 2025-11-28 16:26:32.881 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:26:32 np0005538960 nova_compute[187252]: 2025-11-28 16:26:32.881 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:26:32 np0005538960 nova_compute[187252]: 2025-11-28 16:26:32.881 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ecbea330-ccac-4a01-a80b-0c10a2f686e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:26:32 np0005538960 nova_compute[187252]: 2025-11-28 16:26:32.902 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:33Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c0:6a:b5 10.100.0.27
Nov 28 11:26:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:33Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:6a:b5 10.100.0.27
Nov 28 11:26:34 np0005538960 nova_compute[187252]: 2025-11-28 16:26:34.709 187256 DEBUG nova.compute.manager [req-b3dd193f-762c-42b5-b0af-4c5b9aab857d req-e5a1f2b5-b0b4-4da5-8c16-989a5f0b4ee2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-vif-plugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:26:34 np0005538960 nova_compute[187252]: 2025-11-28 16:26:34.710 187256 DEBUG oslo_concurrency.lockutils [req-b3dd193f-762c-42b5-b0af-4c5b9aab857d req-e5a1f2b5-b0b4-4da5-8c16-989a5f0b4ee2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:34 np0005538960 nova_compute[187252]: 2025-11-28 16:26:34.710 187256 DEBUG oslo_concurrency.lockutils [req-b3dd193f-762c-42b5-b0af-4c5b9aab857d req-e5a1f2b5-b0b4-4da5-8c16-989a5f0b4ee2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:34 np0005538960 nova_compute[187252]: 2025-11-28 16:26:34.710 187256 DEBUG oslo_concurrency.lockutils [req-b3dd193f-762c-42b5-b0af-4c5b9aab857d req-e5a1f2b5-b0b4-4da5-8c16-989a5f0b4ee2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:34 np0005538960 nova_compute[187252]: 2025-11-28 16:26:34.711 187256 DEBUG nova.compute.manager [req-b3dd193f-762c-42b5-b0af-4c5b9aab857d req-e5a1f2b5-b0b4-4da5-8c16-989a5f0b4ee2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] No waiting events found dispatching network-vif-plugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:26:34 np0005538960 nova_compute[187252]: 2025-11-28 16:26:34.711 187256 WARNING nova.compute.manager [req-b3dd193f-762c-42b5-b0af-4c5b9aab857d req-e5a1f2b5-b0b4-4da5-8c16-989a5f0b4ee2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received unexpected event network-vif-plugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.314 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'name': 'tempest-TestNetworkBasicOps-server-1122100469', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001a', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'user_id': 'a4105532118847f583e4bf7594336693', 'hostId': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.315 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.320 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ecbea330-ccac-4a01-a80b-0c10a2f686e2 / tapccc62a21-60 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.321 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for ecbea330-ccac-4a01-a80b-0c10a2f686e2 / tapc5a5eead-79 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.321 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.322 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7540ee0-4827-4726-b05a-03275d364da4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapccc62a21-60', 'timestamp': '2025-11-28T16:26:35.315682', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapccc62a21-60', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:d6:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccc62a21-60'}, 'message_id': '01e18eda-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '86299db0c837f164df4a387c09e2777c87ff2f71e091fa4d9e2fc5795e031a93'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapc5a5eead-79', 'timestamp': '2025-11-28T16:26:35.315682', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapc5a5eead-79', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:6a:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5a5eead-79'}, 'message_id': '01e19d3a-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': 'b7097c92d81e8cb992894e249d84034cd02672d5678af740d997f8eb6a1891f5'}]}, 'timestamp': '2025-11-28 16:26:35.322457', '_unique_id': '08f65091a7134973a4130af28fa2ad16'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.323 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.324 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.340 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.340 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49b79e3d-9e84-40ac-b277-e81b94a41902', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-vda', 'timestamp': '2025-11-28T16:26:35.324879', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01e463bc-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.959070014, 'message_signature': 'a469414310c9444088147cd888e24c06cb635ee3de42eca3c16a0356f2aae35a'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 
'ecbea330-ccac-4a01-a80b-0c10a2f686e2-sda', 'timestamp': '2025-11-28T16:26:35.324879', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01e46fec-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.959070014, 'message_signature': '7944556483784675b943ab33b7320f291b49ebe526c6c2fcab00421a6fb6012d'}]}, 'timestamp': '2025-11-28 16:26:35.340951', '_unique_id': '5e61471b6383435397a290f14e6d3fb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.342 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.343 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.343 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.outgoing.packets volume: 144 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.343 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.outgoing.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f1a7f2cf-77b0-4450-bb1f-e3bbd8a76414', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 144, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapccc62a21-60', 'timestamp': '2025-11-28T16:26:35.343159', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapccc62a21-60', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:d6:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccc62a21-60'}, 'message_id': '01e4d34c-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': 'cb870557e6be48d890cce7adbe037bf20efa80ba249df6ed67d11aa5cc37a088'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 
'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapc5a5eead-79', 'timestamp': '2025-11-28T16:26:35.343159', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapc5a5eead-79', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:6a:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5a5eead-79'}, 'message_id': '01e4dfe0-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '0218b14bda4ed0fbbead6f662872f9407533d558b888d18a9b5fcfd634407399'}]}, 'timestamp': '2025-11-28 16:26:35.343816', '_unique_id': '84b3e77870e94470abad9d0e4266230e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.344 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.345 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.373 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.write.requests volume: 307 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.374 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2badf312-aed4-497f-b66c-da5ed907b02a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 307, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-vda', 'timestamp': '2025-11-28T16:26:35.345464', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01e9965c-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': '412e39e37dc92ab42ba465fe93b3b249b9f2332819f018d9a3e3b5975de84a1f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': 
None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-sda', 'timestamp': '2025-11-28T16:26:35.345464', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01e9adc2-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': '07d077b005d9a0798e7bac83c4305be806013d6bd3cbb4b8485f1b47bd790131'}]}, 'timestamp': '2025-11-28 16:26:35.375393', '_unique_id': '11918a8b52b84f82b9a0d45ba5d89c7e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.376 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.378 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.378 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.read.requests volume: 1090 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.378 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0692fda4-95de-4a56-b135-66b45da4444a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1090, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-vda', 'timestamp': '2025-11-28T16:26:35.378415', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01ea36c0-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': 'b7c72eb480237f79d0f74a2f51f34529c0e820c8bd7c953acddc1994bab0b7fe'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': 
None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-sda', 'timestamp': '2025-11-28T16:26:35.378415', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01ea4980-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': '7f8d4a33d485110a86c3c8e00a3f16533f376532a0d505004c3f9eb422e3849c'}]}, 'timestamp': '2025-11-28 16:26:35.379345', '_unique_id': '1b3675bb2909444db0e529327de92aee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.380 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.381 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.381 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.382 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eba6ba2e-e434-4dff-842a-8491c01441e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapccc62a21-60', 'timestamp': '2025-11-28T16:26:35.381608', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapccc62a21-60', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:d6:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccc62a21-60'}, 'message_id': '01eab32a-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': 'ddb794cd9df182a2a1ba1c0a0d5f5ad0f1514f241d6bba92512e54be2a4b3266'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapc5a5eead-79', 'timestamp': '2025-11-28T16:26:35.381608', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapc5a5eead-79', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:6a:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5a5eead-79'}, 'message_id': '01eac612-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '142253849119e3382da887d0a0456d2640c86d5402fd93066a21d252f7a6d631'}]}, 'timestamp': '2025-11-28 16:26:35.382553', '_unique_id': '7bd0b2345b44465eb1af146810c2b09f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.383 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.384 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.384 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.385 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '933c088c-f229-4511-9811-946f731d3f77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapccc62a21-60', 'timestamp': '2025-11-28T16:26:35.384664', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapccc62a21-60', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:d6:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccc62a21-60'}, 'message_id': '01eb29b8-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '643f41792bb432b7904288311ecd05a725a1c5367f3d25b0ca5441be75d6c8e0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapc5a5eead-79', 'timestamp': '2025-11-28T16:26:35.384664', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapc5a5eead-79', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:6a:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5a5eead-79'}, 'message_id': '01eb384a-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '944da58d1bb7d2bab22f3d445c459a9f049eca646ef707588183be10c780c6ee'}]}, 'timestamp': '2025-11-28 16:26:35.385439', '_unique_id': '6f0ff25fd7c8448182a4d641af2dbe40'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.386 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.387 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.403 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/cpu volume: 12690000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '352ba7a4-37f3-443f-9b62-c410911525e0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12690000000, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'timestamp': '2025-11-28T16:26:35.387198', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '01ee098a-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4295.037362961, 'message_signature': '3e8e5eb6a738bebd06afd65859a9519d860f87e46ca2115407441dd7b7a2c6ab'}]}, 'timestamp': '2025-11-28 16:26:35.403953', '_unique_id': '16fd5e19e0ac40e0a13be407220b538c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.405 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.outgoing.bytes volume: 23478 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.406 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.outgoing.bytes volume: 1242 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '847c8f8d-275c-4ddc-b34a-e749ed6a83ae', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23478, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapccc62a21-60', 'timestamp': '2025-11-28T16:26:35.405862', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapccc62a21-60', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:d6:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccc62a21-60'}, 'message_id': '01ee63da-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '0db875ce6b1b0a67177537f15d79396123bf6fa10bf49ad21b0c6f4e29ff3328'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1242, 'user_id': 
'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapc5a5eead-79', 'timestamp': '2025-11-28T16:26:35.405862', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapc5a5eead-79', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:6a:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5a5eead-79'}, 'message_id': '01ee6ccc-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '3bb2d7c388364b440f43cc34e6c12aad6c42292541933c81563465fa0cc0e2bb'}]}, 'timestamp': '2025-11-28 16:26:35.406359', '_unique_id': '4bf84d0be94944a4b7ed970e350e9813'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.incoming.packets volume: 146 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.407 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.incoming.packets volume: 9 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b0aeaa3-9211-4948-84ca-b46dbaeb492e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 146, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapccc62a21-60', 'timestamp': '2025-11-28T16:26:35.407645', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapccc62a21-60', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:d6:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccc62a21-60'}, 'message_id': '01eea854-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '7e130cfa15bb39a0d30d8be9cb38e406f6c1fa75a4f1604495c17806493a1cf0'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 9, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapc5a5eead-79', 'timestamp': '2025-11-28T16:26:35.407645', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapc5a5eead-79', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:6a:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5a5eead-79'}, 'message_id': '01eeb268-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '9eec0cf61f79e35cb7b41f5e584ea0c9fddaca3f77efc9dd68910c54ada3b047'}]}, 'timestamp': '2025-11-28 16:26:35.408143', '_unique_id': 'e6948cb51c7f4e15b67d9d37394404d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.409 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.409 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.409 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'eb561ef2-92af-4dd1-8de5-114a77233e83', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-vda', 'timestamp': '2025-11-28T16:26:35.409465', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01eeef1c-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.959070014, 'message_signature': '39a113a6d53e2b5068165d1a89eb631764c01ab0a3ed082de07f69517f7ac645'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-sda', 'timestamp': '2025-11-28T16:26:35.409465', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01eef868-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.959070014, 'message_signature': '02c0f6d366322d4f6e2ace90b6e734ff718f0dd88786aaf9a79cf0242e07f11b'}]}, 'timestamp': '2025-11-28 16:26:35.409966', '_unique_id': '858b459fec874ded9f5ac42b9b5ea264'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.410 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.411 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.411 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.411 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b56ec33d-2aae-4d8e-b64d-5ac2c382ae53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapccc62a21-60', 'timestamp': '2025-11-28T16:26:35.411161', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapccc62a21-60', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:d6:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccc62a21-60'}, 'message_id': '01ef3152-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '5dbf5d883825622d5549557697aa3bad0021f9a10f52598ededc19ad10340e4d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapc5a5eead-79', 'timestamp': '2025-11-28T16:26:35.411161', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapc5a5eead-79', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:6a:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5a5eead-79'}, 'message_id': '01ef3bfc-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': 'abf4e3895d435ebacd579acad0adee9d2f14efe2ae1f2239bf6a00363645432d'}]}, 'timestamp': '2025-11-28 16:26:35.411671', '_unique_id': '8878bf197bc447399662292fce15c406'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.412 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.413 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1122100469>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1122100469>]
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.413 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.413 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/memory.usage volume: 42.47265625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36ddbf04-692a-4a72-846d-8fa88f1c3873', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.47265625, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'timestamp': '2025-11-28T16:26:35.413242', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '01ef82b0-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4295.037362961, 'message_signature': '0e789e4efbbac17029d72422686ca5d286806c910ab2c1c268254cbf2b166502'}]}, 'timestamp': '2025-11-28 16:26:35.413493', '_unique_id': '9e40aeb2a3584b3691c206d11a81c9aa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.414 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70522bcd-eed9-4079-a3e7-d2116cca4bcb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapccc62a21-60', 'timestamp': '2025-11-28T16:26:35.414825', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapccc62a21-60', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:d6:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccc62a21-60'}, 'message_id': '01efc13a-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '3c8a24f091174c40795b29dcc6cb0bb1197ca7ec9d422de7b71409224db2e159'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapc5a5eead-79', 'timestamp': '2025-11-28T16:26:35.414825', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapc5a5eead-79', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:6a:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5a5eead-79'}, 'message_id': '01efccca-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '4dc6e55c8c4664998589194953a591d13c51ede38bd552b83b3508e2937f8302'}]}, 'timestamp': '2025-11-28 16:26:35.415402', '_unique_id': '0e40248b26cf4b3d972bed9b67ec17ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.415 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.416 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.416 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.write.bytes volume: 72941568 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.416 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d498bb6-b739-4356-9ae4-8511b4d33612', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72941568, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-vda', 'timestamp': '2025-11-28T16:26:35.416542', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01f00348-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': '251c8851d567b16a7647f49bf579ecb43ffc6f0b35e2441315a4194151054439'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 
'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-sda', 'timestamp': '2025-11-28T16:26:35.416542', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01f00b86-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': '240a19f55b0cb3e675e118d0263cdc1d6410c21dae174d6d04b1d3f578ee4a71'}]}, 'timestamp': '2025-11-28 16:26:35.416987', '_unique_id': '2cfbe3d4bcad4c60b307d23dd590bc52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.417 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.418 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.418 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.incoming.bytes volume: 27899 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.418 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.incoming.bytes volume: 1330 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6370b1f1-07b1-4091-bc03-7127ddc81fc5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 27899, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapccc62a21-60', 'timestamp': '2025-11-28T16:26:35.418294', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapccc62a21-60', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:d6:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccc62a21-60'}, 'message_id': '01f047ea-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '25e4a923271085c73bf0bb1bfdfe16800a3b792666daceafb8802814530ab254'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1330, 'user_id': 
'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapc5a5eead-79', 'timestamp': '2025-11-28T16:26:35.418294', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapc5a5eead-79', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:6a:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5a5eead-79'}, 'message_id': '01f05046-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '5148aea94a9519c984ee8511874fcdb659b329c004ac5e61052614dba79be742'}]}, 'timestamp': '2025-11-28 16:26:35.418731', '_unique_id': 'fd35abf17067451694a4b78158e9a38b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.419 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.420 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0967a85a-f093-41f3-ba4b-273734be3c66', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapccc62a21-60', 'timestamp': '2025-11-28T16:26:35.419960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapccc62a21-60', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a3:d6:a7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapccc62a21-60'}, 'message_id': '01f08912-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '81d62ec0cb3c576f335964e723e2c8134c9fffea00a69dfdb3ba8482b7802c62'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'instance-0000001a-ecbea330-ccac-4a01-a80b-0c10a2f686e2-tapc5a5eead-79', 'timestamp': '2025-11-28T16:26:35.419960', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'tapc5a5eead-79', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:6a:b5', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc5a5eead-79'}, 'message_id': '01f093b2-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.949855419, 'message_signature': '4c8191ba3263ca8aec8301ebd3824a59f6bce01fee31b78eef317addde6a6bb4'}]}, 'timestamp': '2025-11-28 16:26:35.420497', '_unique_id': '5417162d22e64801b70a94ec8ae21fdb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.421 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.422 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.422 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.422 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1122100469>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1122100469>]
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.422 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.422 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.422 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1122100469>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1122100469>]
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.422 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.422 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.423 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8692a95-d55c-4f8f-9394-085ee4ae5753', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-vda', 'timestamp': '2025-11-28T16:26:35.422816', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01f0f97e-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.959070014, 'message_signature': 'bd8995bb2fb628d03ac058977b90bf9fd0d57f570c97fa6bfaef853b3df900a3'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 
'ecbea330-ccac-4a01-a80b-0c10a2f686e2-sda', 'timestamp': '2025-11-28T16:26:35.422816', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01f1020c-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.959070014, 'message_signature': 'eaaf8c57b03d557965c51040e171858077e59b8fb5f4f6371b2ef06469abdd67'}]}, 'timestamp': '2025-11-28 16:26:35.423278', '_unique_id': '8f2f1fc73ee54600b09d3cf02dfd07ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.424 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.read.latency volume: 206311483 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.read.latency volume: 20470701 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7ad040bb-d8ca-4c00-b171-f5f6cbbceb08', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 206311483, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-vda', 'timestamp': '2025-11-28T16:26:35.424767', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01f14640-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': '1c81a361b4413a55d8caaa56eff15a7619271db85e4910ecd74cb7749e42f7e0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 20470701, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-sda', 'timestamp': '2025-11-28T16:26:35.424767', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01f14ec4-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': '1410dfe421c59382fa5dd69096bc895352d4ef0d50ad78bec05a91edb709544f'}]}, 'timestamp': '2025-11-28 16:26:35.425240', '_unique_id': 'b8ccc993ac3a4ea2ae25ec1f74dd6953'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.425 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.426 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.426 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.read.bytes volume: 30218752 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.426 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd146d17-4e7e-4477-83c1-52f4e4bc5e02', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30218752, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-vda', 'timestamp': '2025-11-28T16:26:35.426384', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01f1839e-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': '1ee2c3e92bc74b344006d59fc4341466ca9bb0091b303dbb454c9944f689d397'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-sda', 'timestamp': '2025-11-28T16:26:35.426384', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01f18d94-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': '85a746b52a93fd18224c22eaafc927016bf616a48be5c04fc26bd984b0802e83'}]}, 'timestamp': '2025-11-28 16:26:35.426861', '_unique_id': '6bc640b525fb4f7ea6a9b4ace5be93d7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.427 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.write.latency volume: 3336086940 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 DEBUG ceilometer.compute.pollsters [-] ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5ef0e83-92c4-46e7-b7be-7a92d559ff51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3336086940, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-vda', 'timestamp': '2025-11-28T16:26:35.427952', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '01f1c106-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': '867f1817187e5e10f6dae32b4f25eb0618952ae2e269f37faf33c5d6b43c218c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_name': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_name': None, 'resource_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2-sda', 'timestamp': '2025-11-28T16:26:35.427952', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1122100469', 'name': 'instance-0000001a', 'instance_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'instance_type': 'm1.nano', 'host': '6b318fe2effa6e39437768ee177787087420aecfbc17101e13300dad', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '01f1c994-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4294.979650604, 'message_signature': 'ca5038c27a42780950eb832922d7dd3e5d8c3e034fdc1e8b71fc5504fb78ab06'}]}, 'timestamp': '2025-11-28 16:26:35.428383', '_unique_id': '6f2a581f8e9b484ba2ade87b3c2b32fb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.428 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.429 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.429 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:26:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:26:35.429 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1122100469>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1122100469>]
Nov 28 11:26:35 np0005538960 nova_compute[187252]: 2025-11-28 16:26:35.891 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:26:36 np0005538960 podman[218784]: 2025-11-28 16:26:36.186039718 +0000 UTC m=+0.089618015 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251125)
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.554 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": 
true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.580 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.580 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.581 187256 DEBUG oslo_concurrency.lockutils [req-55ea5351-315b-4756-a1d1-fe662e711b7a req-6df72292-aadf-48ef-908b-9a2e52fb140f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.581 187256 DEBUG nova.network.neutron [req-55ea5351-315b-4756-a1d1-fe662e711b7a req-6df72292-aadf-48ef-908b-9a2e52fb140f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing network info cache for port c5a5eead-793d-43f8-8cc3-a792eb3d80f8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.582 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.583 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.583 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.583 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.583 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.584 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.584 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.627 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.795 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:26:37 np0005538960 nova_compute[187252]: 2025-11-28 16:26:37.905 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:26:39 np0005538960 podman[218811]: 2025-11-28 16:26:39.170538799 +0000 UTC m=+0.067321222 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd)
Nov 28 11:26:39 np0005538960 podman[218812]: 2025-11-28 16:26:39.191160622 +0000 UTC m=+0.075865340 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.414 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.415 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.440 187256 DEBUG nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.557 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.558 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.565 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.566 187256 INFO nova.compute.claims [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Claim successful on node compute-1.ctlplane.example.com
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.825 187256 DEBUG nova.compute.provider_tree [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.840 187256 DEBUG nova.scheduler.client.report [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.894 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.932 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.374s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:26:40 np0005538960 nova_compute[187252]: 2025-11-28 16:26:40.933 187256 DEBUG nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.268 187256 DEBUG nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.269 187256 DEBUG nova.network.neutron [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.334 187256 INFO nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.354 187256 DEBUG nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.435 187256 DEBUG nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.436 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.436 187256 INFO nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Creating image(s)
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.437 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "/var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.437 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "/var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.438 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "/var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.453 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.532 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.533 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.534 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.545 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.610 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.611 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.655 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.657 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.657 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.723 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.724 187256 DEBUG nova.virt.disk.api [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Checking if we can resize image /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.725 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.791 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.792 187256 DEBUG nova.virt.disk.api [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Cannot resize image /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.793 187256 DEBUG nova.objects.instance [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lazy-loading 'migration_context' on Instance uuid 4a3166d7-3401-4240-94f1-f56e885f648e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.806 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.806 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Ensure instance console log exists: /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.807 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.807 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.807 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.816 187256 DEBUG nova.network.neutron [req-55ea5351-315b-4756-a1d1-fe662e711b7a req-6df72292-aadf-48ef-908b-9a2e52fb140f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updated VIF entry in instance network info cache for port c5a5eead-793d-43f8-8cc3-a792eb3d80f8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.817 187256 DEBUG nova.network.neutron [req-55ea5351-315b-4756-a1d1-fe662e711b7a req-6df72292-aadf-48ef-908b-9a2e52fb140f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.829 187256 DEBUG nova.policy [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:26:41 np0005538960 nova_compute[187252]: 2025-11-28 16:26:41.833 187256 DEBUG oslo_concurrency.lockutils [req-55ea5351-315b-4756-a1d1-fe662e711b7a req-6df72292-aadf-48ef-908b-9a2e52fb140f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:26:42 np0005538960 nova_compute[187252]: 2025-11-28 16:26:42.938 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:43 np0005538960 nova_compute[187252]: 2025-11-28 16:26:43.132 187256 DEBUG nova.network.neutron [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Successfully created port: af92c5bb-bdd8-47de-9e20-776300662aa0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:26:44 np0005538960 nova_compute[187252]: 2025-11-28 16:26:44.007 187256 DEBUG nova.network.neutron [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Successfully created port: 4a8384e7-5273-441f-aa76-37047262d85e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:26:44 np0005538960 podman[218867]: 2025-11-28 16:26:44.151848913 +0000 UTC m=+0.058985208 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:26:45 np0005538960 nova_compute[187252]: 2025-11-28 16:26:45.607 187256 DEBUG nova.network.neutron [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Successfully updated port: af92c5bb-bdd8-47de-9e20-776300662aa0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:26:45 np0005538960 nova_compute[187252]: 2025-11-28 16:26:45.767 187256 DEBUG nova.compute.manager [req-012671ed-bae5-4222-9738-66d965a8f430 req-e306f577-36ff-4405-92e8-eabdebdfad4d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-changed-af92c5bb-bdd8-47de-9e20-776300662aa0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:26:45 np0005538960 nova_compute[187252]: 2025-11-28 16:26:45.768 187256 DEBUG nova.compute.manager [req-012671ed-bae5-4222-9738-66d965a8f430 req-e306f577-36ff-4405-92e8-eabdebdfad4d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Refreshing instance network info cache due to event network-changed-af92c5bb-bdd8-47de-9e20-776300662aa0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:26:45 np0005538960 nova_compute[187252]: 2025-11-28 16:26:45.769 187256 DEBUG oslo_concurrency.lockutils [req-012671ed-bae5-4222-9738-66d965a8f430 req-e306f577-36ff-4405-92e8-eabdebdfad4d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:26:45 np0005538960 nova_compute[187252]: 2025-11-28 16:26:45.769 187256 DEBUG oslo_concurrency.lockutils [req-012671ed-bae5-4222-9738-66d965a8f430 req-e306f577-36ff-4405-92e8-eabdebdfad4d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:26:45 np0005538960 nova_compute[187252]: 2025-11-28 16:26:45.769 187256 DEBUG nova.network.neutron [req-012671ed-bae5-4222-9738-66d965a8f430 req-e306f577-36ff-4405-92e8-eabdebdfad4d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Refreshing network info cache for port af92c5bb-bdd8-47de-9e20-776300662aa0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:26:45 np0005538960 nova_compute[187252]: 2025-11-28 16:26:45.896 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:46 np0005538960 nova_compute[187252]: 2025-11-28 16:26:46.776 187256 DEBUG nova.network.neutron [req-012671ed-bae5-4222-9738-66d965a8f430 req-e306f577-36ff-4405-92e8-eabdebdfad4d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:26:47 np0005538960 nova_compute[187252]: 2025-11-28 16:26:47.841 187256 DEBUG nova.network.neutron [req-012671ed-bae5-4222-9738-66d965a8f430 req-e306f577-36ff-4405-92e8-eabdebdfad4d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:26:47 np0005538960 nova_compute[187252]: 2025-11-28 16:26:47.858 187256 DEBUG oslo_concurrency.lockutils [req-012671ed-bae5-4222-9738-66d965a8f430 req-e306f577-36ff-4405-92e8-eabdebdfad4d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:26:47 np0005538960 nova_compute[187252]: 2025-11-28 16:26:47.942 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:48 np0005538960 nova_compute[187252]: 2025-11-28 16:26:48.161 187256 DEBUG nova.network.neutron [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Successfully updated port: 4a8384e7-5273-441f-aa76-37047262d85e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:26:48 np0005538960 nova_compute[187252]: 2025-11-28 16:26:48.190 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:26:48 np0005538960 nova_compute[187252]: 2025-11-28 16:26:48.190 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquired lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:26:48 np0005538960 nova_compute[187252]: 2025-11-28 16:26:48.190 187256 DEBUG nova.network.neutron [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:26:48 np0005538960 podman[218891]: 2025-11-28 16:26:48.194248967 +0000 UTC m=+0.093802717 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 11:26:48 np0005538960 nova_compute[187252]: 2025-11-28 16:26:48.344 187256 DEBUG nova.compute.manager [req-1b5d1bac-e3e8-4c49-afac-be8c229129c2 req-08392d48-0fa7-4d3d-965d-36ae125e3be1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-changed-4a8384e7-5273-441f-aa76-37047262d85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:26:48 np0005538960 nova_compute[187252]: 2025-11-28 16:26:48.344 187256 DEBUG nova.compute.manager [req-1b5d1bac-e3e8-4c49-afac-be8c229129c2 req-08392d48-0fa7-4d3d-965d-36ae125e3be1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Refreshing instance network info cache due to event network-changed-4a8384e7-5273-441f-aa76-37047262d85e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:26:48 np0005538960 nova_compute[187252]: 2025-11-28 16:26:48.345 187256 DEBUG oslo_concurrency.lockutils [req-1b5d1bac-e3e8-4c49-afac-be8c229129c2 req-08392d48-0fa7-4d3d-965d-36ae125e3be1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:26:48 np0005538960 nova_compute[187252]: 2025-11-28 16:26:48.561 187256 DEBUG nova.network.neutron [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:26:49 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:49Z|00123|binding|INFO|Releasing lport 13f43c4e-df41-48fa-9d27-34cbd88f8c17 from this chassis (sb_readonly=0)
Nov 28 11:26:49 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:49Z|00124|binding|INFO|Releasing lport 065b851d-69a4-49d0-a066-f5c141f99961 from this chassis (sb_readonly=0)
Nov 28 11:26:49 np0005538960 nova_compute[187252]: 2025-11-28 16:26:49.194 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:50 np0005538960 nova_compute[187252]: 2025-11-28 16:26:50.901 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:52 np0005538960 nova_compute[187252]: 2025-11-28 16:26:52.945 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.807 187256 DEBUG nova.network.neutron [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Updating instance_info_cache with network_info: [{"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.836 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Releasing lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.837 187256 DEBUG nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Instance network_info: |[{"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.838 187256 DEBUG oslo_concurrency.lockutils [req-1b5d1bac-e3e8-4c49-afac-be8c229129c2 req-08392d48-0fa7-4d3d-965d-36ae125e3be1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.838 187256 DEBUG nova.network.neutron [req-1b5d1bac-e3e8-4c49-afac-be8c229129c2 req-08392d48-0fa7-4d3d-965d-36ae125e3be1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Refreshing network info cache for port 4a8384e7-5273-441f-aa76-37047262d85e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.843 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Start _get_guest_xml network_info=[{"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.849 187256 WARNING nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.857 187256 DEBUG nova.virt.libvirt.host [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.858 187256 DEBUG nova.virt.libvirt.host [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.870 187256 DEBUG nova.virt.libvirt.host [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.871 187256 DEBUG nova.virt.libvirt.host [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.872 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.873 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.873 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.873 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.874 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.874 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.874 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.874 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.875 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.875 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.875 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.875 187256 DEBUG nova.virt.hardware [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.879 187256 DEBUG nova.virt.libvirt.vif [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:26:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1299525305',display_name='tempest-TestGettingAddress-server-1299525305',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1299525305',id=30,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG25bqRXMr3FU4IFmiWP4AKWa73QJj7Qe460t1LnUyUyJdk0XAT/L8jqfNUYoWQJfX3cLeLdAp3ZQ2H2sXeDrQd/UcTEwydPrQ/zJuzyzXV6hqrU4M+wB1dzDG7carbtrg==',key_name='tempest-TestGettingAddress-1714235428',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-ilkyg4d6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:26:41Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=4a3166d7-3401-4240-94f1-f56e885f648e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.879 187256 DEBUG nova.network.os_vif_util [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.880 187256 DEBUG nova.network.os_vif_util [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:48:99,bridge_name='br-int',has_traffic_filtering=True,id=af92c5bb-bdd8-47de-9e20-776300662aa0,network=Network(42968652-da0d-4f84-a781-24c009bf7324),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf92c5bb-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.881 187256 DEBUG nova.virt.libvirt.vif [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:26:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1299525305',display_name='tempest-TestGettingAddress-server-1299525305',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1299525305',id=30,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG25bqRXMr3FU4IFmiWP4AKWa73QJj7Qe460t1LnUyUyJdk0XAT/L8jqfNUYoWQJfX3cLeLdAp3ZQ2H2sXeDrQd/UcTEwydPrQ/zJuzyzXV6hqrU4M+wB1dzDG7carbtrg==',key_name='tempest-TestGettingAddress-1714235428',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-ilkyg4d6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:26:41Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=4a3166d7-3401-4240-94f1-f56e885f648e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.881 187256 DEBUG nova.network.os_vif_util [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.882 187256 DEBUG nova.network.os_vif_util [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:44:f0,bridge_name='br-int',has_traffic_filtering=True,id=4a8384e7-5273-441f-aa76-37047262d85e,network=Network(3052edd0-a739-4b9a-8e70-10ed6d70aba3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a8384e7-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.883 187256 DEBUG nova.objects.instance [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4a3166d7-3401-4240-94f1-f56e885f648e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.897 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <uuid>4a3166d7-3401-4240-94f1-f56e885f648e</uuid>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <name>instance-0000001e</name>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestGettingAddress-server-1299525305</nova:name>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:26:53</nova:creationTime>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        <nova:user uuid="23b8e0c173df4c2883fccd8cb472e427">tempest-TestGettingAddress-2054466537-project-member</nova:user>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        <nova:project uuid="b5f802fe6e0b4d62bba6143515207a40">tempest-TestGettingAddress-2054466537</nova:project>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        <nova:port uuid="af92c5bb-bdd8-47de-9e20-776300662aa0">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        <nova:port uuid="4a8384e7-5273-441f-aa76-37047262d85e">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe26:44f0" ipVersion="6"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <entry name="serial">4a3166d7-3401-4240-94f1-f56e885f648e</entry>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <entry name="uuid">4a3166d7-3401-4240-94f1-f56e885f648e</entry>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk.config"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:a9:48:99"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <target dev="tapaf92c5bb-bd"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:26:44:f0"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <target dev="tap4a8384e7-52"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/console.log" append="off"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:26:53 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:26:53 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:26:53 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:26:53 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.899 187256 DEBUG nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Preparing to wait for external event network-vif-plugged-af92c5bb-bdd8-47de-9e20-776300662aa0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.900 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.900 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.900 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.900 187256 DEBUG nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Preparing to wait for external event network-vif-plugged-4a8384e7-5273-441f-aa76-37047262d85e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.901 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.901 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.901 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.902 187256 DEBUG nova.virt.libvirt.vif [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:26:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1299525305',display_name='tempest-TestGettingAddress-server-1299525305',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1299525305',id=30,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG25bqRXMr3FU4IFmiWP4AKWa73QJj7Qe460t1LnUyUyJdk0XAT/L8jqfNUYoWQJfX3cLeLdAp3ZQ2H2sXeDrQd/UcTEwydPrQ/zJuzyzXV6hqrU4M+wB1dzDG7carbtrg==',key_name='tempest-TestGettingAddress-1714235428',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-ilkyg4d6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:26:41Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=4a3166d7-3401-4240-94f1-f56e885f648e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.902 187256 DEBUG nova.network.os_vif_util [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.903 187256 DEBUG nova.network.os_vif_util [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a9:48:99,bridge_name='br-int',has_traffic_filtering=True,id=af92c5bb-bdd8-47de-9e20-776300662aa0,network=Network(42968652-da0d-4f84-a781-24c009bf7324),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf92c5bb-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.903 187256 DEBUG os_vif [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:48:99,bridge_name='br-int',has_traffic_filtering=True,id=af92c5bb-bdd8-47de-9e20-776300662aa0,network=Network(42968652-da0d-4f84-a781-24c009bf7324),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf92c5bb-bd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.904 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.904 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.904 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.907 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.907 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf92c5bb-bd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.907 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaf92c5bb-bd, col_values=(('external_ids', {'iface-id': 'af92c5bb-bdd8-47de-9e20-776300662aa0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a9:48:99', 'vm-uuid': '4a3166d7-3401-4240-94f1-f56e885f648e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.909 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:53 np0005538960 NetworkManager[55548]: <info>  [1764347213.9103] manager: (tapaf92c5bb-bd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.911 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.921 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.925 187256 INFO os_vif [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a9:48:99,bridge_name='br-int',has_traffic_filtering=True,id=af92c5bb-bdd8-47de-9e20-776300662aa0,network=Network(42968652-da0d-4f84-a781-24c009bf7324),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf92c5bb-bd')#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.926 187256 DEBUG nova.virt.libvirt.vif [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:26:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1299525305',display_name='tempest-TestGettingAddress-server-1299525305',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1299525305',id=30,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG25bqRXMr3FU4IFmiWP4AKWa73QJj7Qe460t1LnUyUyJdk0XAT/L8jqfNUYoWQJfX3cLeLdAp3ZQ2H2sXeDrQd/UcTEwydPrQ/zJuzyzXV6hqrU4M+wB1dzDG7carbtrg==',key_name='tempest-TestGettingAddress-1714235428',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-ilkyg4d6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:26:41Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=4a3166d7-3401-4240-94f1-f56e885f648e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.927 187256 DEBUG nova.network.os_vif_util [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.928 187256 DEBUG nova.network.os_vif_util [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:44:f0,bridge_name='br-int',has_traffic_filtering=True,id=4a8384e7-5273-441f-aa76-37047262d85e,network=Network(3052edd0-a739-4b9a-8e70-10ed6d70aba3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a8384e7-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.929 187256 DEBUG os_vif [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:44:f0,bridge_name='br-int',has_traffic_filtering=True,id=4a8384e7-5273-441f-aa76-37047262d85e,network=Network(3052edd0-a739-4b9a-8e70-10ed6d70aba3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a8384e7-52') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.930 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.930 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.931 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.934 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.935 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4a8384e7-52, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.936 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4a8384e7-52, col_values=(('external_ids', {'iface-id': '4a8384e7-5273-441f-aa76-37047262d85e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:26:44:f0', 'vm-uuid': '4a3166d7-3401-4240-94f1-f56e885f648e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.938 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:53 np0005538960 NetworkManager[55548]: <info>  [1764347213.9393] manager: (tap4a8384e7-52): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.942 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.948 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:53 np0005538960 nova_compute[187252]: 2025-11-28 16:26:53.949 187256 INFO os_vif [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:44:f0,bridge_name='br-int',has_traffic_filtering=True,id=4a8384e7-5273-441f-aa76-37047262d85e,network=Network(3052edd0-a739-4b9a-8e70-10ed6d70aba3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a8384e7-52')#033[00m
Nov 28 11:26:54 np0005538960 nova_compute[187252]: 2025-11-28 16:26:54.021 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:26:54 np0005538960 nova_compute[187252]: 2025-11-28 16:26:54.023 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:26:54 np0005538960 nova_compute[187252]: 2025-11-28 16:26:54.024 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No VIF found with MAC fa:16:3e:a9:48:99, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:26:54 np0005538960 nova_compute[187252]: 2025-11-28 16:26:54.025 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No VIF found with MAC fa:16:3e:26:44:f0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:26:54 np0005538960 nova_compute[187252]: 2025-11-28 16:26:54.026 187256 INFO nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Using config drive#033[00m
Nov 28 11:26:54 np0005538960 nova_compute[187252]: 2025-11-28 16:26:54.783 187256 INFO nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Creating config drive at /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk.config#033[00m
Nov 28 11:26:54 np0005538960 nova_compute[187252]: 2025-11-28 16:26:54.789 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph33_mmtk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:26:54 np0005538960 nova_compute[187252]: 2025-11-28 16:26:54.917 187256 DEBUG oslo_concurrency.processutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph33_mmtk" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:26:54 np0005538960 NetworkManager[55548]: <info>  [1764347214.9891] manager: (tapaf92c5bb-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/75)
Nov 28 11:26:54 np0005538960 kernel: tapaf92c5bb-bd: entered promiscuous mode
Nov 28 11:26:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:54Z|00125|binding|INFO|Claiming lport af92c5bb-bdd8-47de-9e20-776300662aa0 for this chassis.
Nov 28 11:26:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:54Z|00126|binding|INFO|af92c5bb-bdd8-47de-9e20-776300662aa0: Claiming fa:16:3e:a9:48:99 10.100.0.11
Nov 28 11:26:54 np0005538960 nova_compute[187252]: 2025-11-28 16:26:54.993 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.004 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:48:99 10.100.0.11'], port_security=['fa:16:3e:a9:48:99 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4a3166d7-3401-4240-94f1-f56e885f648e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42968652-da0d-4f84-a781-24c009bf7324', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd295b2f9-6c9c-4736-955f-6e49c8f210cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f9bcb89-9d40-426c-aef2-9e1f3ea8fc79, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=af92c5bb-bdd8-47de-9e20-776300662aa0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.005 104369 INFO neutron.agent.ovn.metadata.agent [-] Port af92c5bb-bdd8-47de-9e20-776300662aa0 in datapath 42968652-da0d-4f84-a781-24c009bf7324 bound to our chassis#033[00m
Nov 28 11:26:55 np0005538960 NetworkManager[55548]: <info>  [1764347215.0064] manager: (tap4a8384e7-52): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.007 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 42968652-da0d-4f84-a781-24c009bf7324#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.020 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f57121be-17cd-4f51-bee3-8c1da98e725e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.021 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap42968652-d1 in ovnmeta-42968652-da0d-4f84-a781-24c009bf7324 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.023 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap42968652-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.023 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[13b8015d-7f95-4884-b62f-86e9abbc65d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.024 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[48eeda41-f4d9-491b-9c28-23c8ac116d3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 systemd-udevd[218946]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:26:55 np0005538960 kernel: tap4a8384e7-52: entered promiscuous mode
Nov 28 11:26:55 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:55Z|00127|binding|INFO|Setting lport af92c5bb-bdd8-47de-9e20-776300662aa0 ovn-installed in OVS
Nov 28 11:26:55 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:55Z|00128|binding|INFO|Setting lport af92c5bb-bdd8-47de-9e20-776300662aa0 up in Southbound
Nov 28 11:26:55 np0005538960 systemd-udevd[218947]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.034 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.038 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[1646a7b5-2d48-448d-a32e-4cbecebbb05a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:55Z|00129|binding|INFO|Claiming lport 4a8384e7-5273-441f-aa76-37047262d85e for this chassis.
Nov 28 11:26:55 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:55Z|00130|binding|INFO|4a8384e7-5273-441f-aa76-37047262d85e: Claiming fa:16:3e:26:44:f0 2001:db8::f816:3eff:fe26:44f0
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.046 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:44:f0 2001:db8::f816:3eff:fe26:44f0'], port_security=['fa:16:3e:26:44:f0 2001:db8::f816:3eff:fe26:44f0'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe26:44f0/64', 'neutron:device_id': '4a3166d7-3401-4240-94f1-f56e885f648e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3052edd0-a739-4b9a-8e70-10ed6d70aba3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd295b2f9-6c9c-4736-955f-6e49c8f210cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18498b44-2b4d-4a21-814c-0acc5cfb3b2f, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=4a8384e7-5273-441f-aa76-37047262d85e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:26:55 np0005538960 NetworkManager[55548]: <info>  [1764347215.0495] device (tapaf92c5bb-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:26:55 np0005538960 NetworkManager[55548]: <info>  [1764347215.0508] device (tapaf92c5bb-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:26:55 np0005538960 NetworkManager[55548]: <info>  [1764347215.0524] device (tap4a8384e7-52): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:26:55 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:55Z|00131|binding|INFO|Setting lport 4a8384e7-5273-441f-aa76-37047262d85e ovn-installed in OVS
Nov 28 11:26:55 np0005538960 NetworkManager[55548]: <info>  [1764347215.0539] device (tap4a8384e7-52): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:26:55 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:55Z|00132|binding|INFO|Setting lport 4a8384e7-5273-441f-aa76-37047262d85e up in Southbound
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.054 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.068 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3c52f368-ed9e-4c20-97d7-fc27068bf41e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 systemd-machined[153518]: New machine qemu-10-instance-0000001e.
Nov 28 11:26:55 np0005538960 systemd[1]: Started Virtual Machine qemu-10-instance-0000001e.
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.105 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[14d3b70e-7a58-4d4d-9ca5-625319db55a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.111 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a1f50531-38f1-4bea-ae08-4ff10907c237]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 NetworkManager[55548]: <info>  [1764347215.1128] manager: (tap42968652-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.151 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[79b6293a-1d9a-4efb-bb53-9f481e168bc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.154 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[825429ca-c2f3-44fa-aa15-ca0fd41cb74a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 NetworkManager[55548]: <info>  [1764347215.1844] device (tap42968652-d0): carrier: link connected
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.191 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[33959e1d-b0eb-4ceb-b9be-0418c61bf5b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.213 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[921c1cb8-8332-406c-b43a-c00195172059]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42968652-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:47:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431476, 'reachable_time': 20557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218982, 'error': None, 'target': 'ovnmeta-42968652-da0d-4f84-a781-24c009bf7324', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.230 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6ec17587-0b96-4f86-bf06-9c702513afd9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee8:4757'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431476, 'tstamp': 431476}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218983, 'error': None, 'target': 'ovnmeta-42968652-da0d-4f84-a781-24c009bf7324', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.251 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[0477f4ed-bec5-4ea9-8d9d-53a36f371a8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42968652-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e8:47:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 40], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431476, 'reachable_time': 20557, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218984, 'error': None, 'target': 'ovnmeta-42968652-da0d-4f84-a781-24c009bf7324', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.285 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb735b8-cd6d-4c1b-a175-8c84907805b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.354 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5d7032a3-c258-4792-be23-0be85c461a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.356 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42968652-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.357 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.357 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42968652-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.359 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:55 np0005538960 NetworkManager[55548]: <info>  [1764347215.3603] manager: (tap42968652-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Nov 28 11:26:55 np0005538960 kernel: tap42968652-d0: entered promiscuous mode
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.362 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap42968652-d0, col_values=(('external_ids', {'iface-id': 'd4d756c4-51bd-453d-913e-af78cee63393'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.363 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:55 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:55Z|00133|binding|INFO|Releasing lport d4d756c4-51bd-453d-913e-af78cee63393 from this chassis (sb_readonly=0)
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.364 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.366 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/42968652-da0d-4f84-a781-24c009bf7324.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/42968652-da0d-4f84-a781-24c009bf7324.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.367 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e56c1c62-b14f-475a-bb30-c4e2cb28f817]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.368 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-42968652-da0d-4f84-a781-24c009bf7324
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/42968652-da0d-4f84-a781-24c009bf7324.pid.haproxy
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID 42968652-da0d-4f84-a781-24c009bf7324
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.368 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-42968652-da0d-4f84-a781-24c009bf7324', 'env', 'PROCESS_TAG=haproxy-42968652-da0d-4f84-a781-24c009bf7324', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/42968652-da0d-4f84-a781-24c009bf7324.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.378 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.529 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347215.5284083, 4a3166d7-3401-4240-94f1-f56e885f648e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.530 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] VM Started (Lifecycle Event)#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.559 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.570 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347215.5289507, 4a3166d7-3401-4240-94f1-f56e885f648e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.571 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.597 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.600 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:26:55 np0005538960 nova_compute[187252]: 2025-11-28 16:26:55.625 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:26:55 np0005538960 podman[219024]: 2025-11-28 16:26:55.764278409 +0000 UTC m=+0.056079198 container create 53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 11:26:55 np0005538960 systemd[1]: Started libpod-conmon-53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525.scope.
Nov 28 11:26:55 np0005538960 podman[219024]: 2025-11-28 16:26:55.727299177 +0000 UTC m=+0.019099976 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:26:55 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:26:55 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1709eee38be9cde8fdc3f6994f55f7a74b8032f7b649228be5848b642ef6324c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:26:55 np0005538960 podman[219024]: 2025-11-28 16:26:55.88619585 +0000 UTC m=+0.177996639 container init 53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:26:55 np0005538960 podman[219024]: 2025-11-28 16:26:55.89194591 +0000 UTC m=+0.183746679 container start 53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 11:26:55 np0005538960 neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324[219039]: [NOTICE]   (219043) : New worker (219045) forked
Nov 28 11:26:55 np0005538960 neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324[219039]: [NOTICE]   (219043) : Loading success.
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.964 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 4a8384e7-5273-441f-aa76-37047262d85e in datapath 3052edd0-a739-4b9a-8e70-10ed6d70aba3 unbound from our chassis#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.966 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3052edd0-a739-4b9a-8e70-10ed6d70aba3#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.980 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[7d2feec4-4f79-43e6-8e35-2f24feaf3e27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.981 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3052edd0-a1 in ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.983 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3052edd0-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.983 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[901985b4-2be1-4df7-b633-2ec670bb2b79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.984 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[23a2ca95-576a-4d28-ae4b-5339680f9d92]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:55.995 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[65663ab9-7e89-42ee-a7cc-a27861d198d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.010 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[4f51cb82-c4b5-4736-a5c2-cf2c01c6a704]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.036 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[770a68f7-df00-450c-a6b6-8d5a4cf24f17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 NetworkManager[55548]: <info>  [1764347216.0431] manager: (tap3052edd0-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.042 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[0b118683-3089-4bb7-b0de-5d3b77573c3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 systemd-udevd[218973]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.072 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[7fc5d5a9-565e-4b08-9173-fd290c721407]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.074 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[bd771c55-9280-4b85-9245-ce119ffe37f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 NetworkManager[55548]: <info>  [1764347216.0971] device (tap3052edd0-a0): carrier: link connected
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.104 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[d3752307-86a0-4c67-9734-f75d5d4bb690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.123 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[795ee0e5-326c-41c4-a3ae-798d25feaf49]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3052edd0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:2c:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431567, 'reachable_time': 23490, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219064, 'error': None, 'target': 'ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.140 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c37df24a-8196-45db-9870-8058a3f1243f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe14:2c21'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 431567, 'tstamp': 431567}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219065, 'error': None, 'target': 'ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.163 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[183ff30a-955b-4bef-8f53-dcb1fcf58d3b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3052edd0-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:14:2c:21'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431567, 'reachable_time': 23490, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219066, 'error': None, 'target': 'ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 nova_compute[187252]: 2025-11-28 16:26:56.184 187256 DEBUG nova.compute.manager [req-32b5fe51-47ec-47f0-8f51-66dd50e675e1 req-4bc66b50-9ad3-404e-8171-968e1bb9526c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-plugged-af92c5bb-bdd8-47de-9e20-776300662aa0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:26:56 np0005538960 nova_compute[187252]: 2025-11-28 16:26:56.185 187256 DEBUG oslo_concurrency.lockutils [req-32b5fe51-47ec-47f0-8f51-66dd50e675e1 req-4bc66b50-9ad3-404e-8171-968e1bb9526c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:26:56 np0005538960 nova_compute[187252]: 2025-11-28 16:26:56.185 187256 DEBUG oslo_concurrency.lockutils [req-32b5fe51-47ec-47f0-8f51-66dd50e675e1 req-4bc66b50-9ad3-404e-8171-968e1bb9526c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:26:56 np0005538960 nova_compute[187252]: 2025-11-28 16:26:56.185 187256 DEBUG oslo_concurrency.lockutils [req-32b5fe51-47ec-47f0-8f51-66dd50e675e1 req-4bc66b50-9ad3-404e-8171-968e1bb9526c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:26:56 np0005538960 nova_compute[187252]: 2025-11-28 16:26:56.186 187256 DEBUG nova.compute.manager [req-32b5fe51-47ec-47f0-8f51-66dd50e675e1 req-4bc66b50-9ad3-404e-8171-968e1bb9526c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Processing event network-vif-plugged-af92c5bb-bdd8-47de-9e20-776300662aa0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.196 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a7b8dd42-d9d3-4b1f-bdd0-0c49dba9cf22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.228 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6903b8fe-0553-4e10-93fe-b8b7bae3c52b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.230 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3052edd0-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.230 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.230 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3052edd0-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:56 np0005538960 nova_compute[187252]: 2025-11-28 16:26:56.232 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:56 np0005538960 NetworkManager[55548]: <info>  [1764347216.2335] manager: (tap3052edd0-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Nov 28 11:26:56 np0005538960 kernel: tap3052edd0-a0: entered promiscuous mode
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.235 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3052edd0-a0, col_values=(('external_ids', {'iface-id': '1f0021f0-eb60-4d98-8ef4-9ff16fc818e5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:26:56 np0005538960 ovn_controller[95460]: 2025-11-28T16:26:56Z|00134|binding|INFO|Releasing lport 1f0021f0-eb60-4d98-8ef4-9ff16fc818e5 from this chassis (sb_readonly=0)
Nov 28 11:26:56 np0005538960 nova_compute[187252]: 2025-11-28 16:26:56.236 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:56 np0005538960 nova_compute[187252]: 2025-11-28 16:26:56.251 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.252 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3052edd0-a739-4b9a-8e70-10ed6d70aba3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3052edd0-a739-4b9a-8e70-10ed6d70aba3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:26:56 np0005538960 nova_compute[187252]: 2025-11-28 16:26:56.253 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.254 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5626262d-8b2c-44d3-af44-9ffb5084b662]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.255 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-3052edd0-a739-4b9a-8e70-10ed6d70aba3
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/3052edd0-a739-4b9a-8e70-10ed6d70aba3.pid.haproxy
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID 3052edd0-a739-4b9a-8e70-10ed6d70aba3
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:26:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:26:56.256 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3', 'env', 'PROCESS_TAG=haproxy-3052edd0-a739-4b9a-8e70-10ed6d70aba3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3052edd0-a739-4b9a-8e70-10ed6d70aba3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:26:56 np0005538960 podman[219096]: 2025-11-28 16:26:56.630242602 +0000 UTC m=+0.053748550 container create 0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:26:56 np0005538960 systemd[1]: Started libpod-conmon-0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7.scope.
Nov 28 11:26:56 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:26:56 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f12ee0dd95d9ff24c2915c0121b82d91a1a835366d8be1bddbb76f07020758f7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:26:56 np0005538960 podman[219096]: 2025-11-28 16:26:56.690004989 +0000 UTC m=+0.113510957 container init 0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 11:26:56 np0005538960 podman[219096]: 2025-11-28 16:26:56.599950003 +0000 UTC m=+0.023455981 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:26:56 np0005538960 podman[219096]: 2025-11-28 16:26:56.697703426 +0000 UTC m=+0.121209374 container start 0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 11:26:56 np0005538960 neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3[219113]: [NOTICE]   (219126) : New worker (219133) forked
Nov 28 11:26:56 np0005538960 neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3[219113]: [NOTICE]   (219126) : Loading success.
Nov 28 11:26:56 np0005538960 podman[219110]: 2025-11-28 16:26:56.744086176 +0000 UTC m=+0.077039248 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 11:26:57 np0005538960 nova_compute[187252]: 2025-11-28 16:26:57.367 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:26:57 np0005538960 nova_compute[187252]: 2025-11-28 16:26:57.389 187256 DEBUG nova.network.neutron [req-1b5d1bac-e3e8-4c49-afac-be8c229129c2 req-08392d48-0fa7-4d3d-965d-36ae125e3be1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Updated VIF entry in instance network info cache for port 4a8384e7-5273-441f-aa76-37047262d85e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 11:26:57 np0005538960 nova_compute[187252]: 2025-11-28 16:26:57.389 187256 DEBUG nova.network.neutron [req-1b5d1bac-e3e8-4c49-afac-be8c229129c2 req-08392d48-0fa7-4d3d-965d-36ae125e3be1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Updating instance_info_cache with network_info: [{"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 11:26:57 np0005538960 nova_compute[187252]: 2025-11-28 16:26:57.417 187256 DEBUG oslo_concurrency.lockutils [req-1b5d1bac-e3e8-4c49-afac-be8c229129c2 req-08392d48-0fa7-4d3d-965d-36ae125e3be1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 11:26:57 np0005538960 nova_compute[187252]: 2025-11-28 16:26:57.994 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.293 187256 DEBUG nova.compute.manager [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-plugged-af92c5bb-bdd8-47de-9e20-776300662aa0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.294 187256 DEBUG oslo_concurrency.lockutils [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.294 187256 DEBUG oslo_concurrency.lockutils [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.294 187256 DEBUG oslo_concurrency.lockutils [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.294 187256 DEBUG nova.compute.manager [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] No event matching network-vif-plugged-af92c5bb-bdd8-47de-9e20-776300662aa0 in dict_keys([('network-vif-plugged', '4a8384e7-5273-441f-aa76-37047262d85e')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.295 187256 WARNING nova.compute.manager [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received unexpected event network-vif-plugged-af92c5bb-bdd8-47de-9e20-776300662aa0 for instance with vm_state building and task_state spawning.
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.295 187256 DEBUG nova.compute.manager [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-plugged-4a8384e7-5273-441f-aa76-37047262d85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.295 187256 DEBUG oslo_concurrency.lockutils [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.295 187256 DEBUG oslo_concurrency.lockutils [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.295 187256 DEBUG oslo_concurrency.lockutils [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.296 187256 DEBUG nova.compute.manager [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Processing event network-vif-plugged-4a8384e7-5273-441f-aa76-37047262d85e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.296 187256 DEBUG nova.compute.manager [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-plugged-4a8384e7-5273-441f-aa76-37047262d85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.296 187256 DEBUG oslo_concurrency.lockutils [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.296 187256 DEBUG oslo_concurrency.lockutils [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.296 187256 DEBUG oslo_concurrency.lockutils [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.297 187256 DEBUG nova.compute.manager [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] No waiting events found dispatching network-vif-plugged-4a8384e7-5273-441f-aa76-37047262d85e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.297 187256 WARNING nova.compute.manager [req-d72fc0a8-0a5d-48ef-b467-096837d28e2c req-4ec0b9c0-38bf-4bf2-867d-973b86de45de 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received unexpected event network-vif-plugged-4a8384e7-5273-441f-aa76-37047262d85e for instance with vm_state building and task_state spawning.
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.297 187256 DEBUG nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Instance event wait completed in 2 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.303 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347218.3028862, 4a3166d7-3401-4240-94f1-f56e885f648e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.303 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] VM Resumed (Lifecycle Event)
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.306 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.311 187256 INFO nova.virt.libvirt.driver [-] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Instance spawned successfully.
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.312 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.339 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.347 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.351 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.352 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.352 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.353 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.353 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.354 187256 DEBUG nova.virt.libvirt.driver [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.388 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] During sync_power_state the instance has a pending task (spawning). Skip.
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.440 187256 INFO nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Took 17.00 seconds to spawn the instance on the hypervisor.
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.440 187256 DEBUG nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.648 187256 INFO nova.compute.manager [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Took 18.13 seconds to build instance.
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.675 187256 DEBUG oslo_concurrency.lockutils [None req-bc88c03c-efee-4851-b656-edd9e6b92c07 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:26:58 np0005538960 nova_compute[187252]: 2025-11-28 16:26:58.940 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:27:00 np0005538960 podman[219145]: 2025-11-28 16:27:00.176012524 +0000 UTC m=+0.083314202 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:27:02 np0005538960 nova_compute[187252]: 2025-11-28 16:27:02.216 187256 DEBUG nova.compute.manager [req-80401c2d-e03e-40ad-9d01-2401faa435e1 req-62487b5f-f676-40d5-bb9e-fe6544c04e92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-changed-af92c5bb-bdd8-47de-9e20-776300662aa0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 11:27:02 np0005538960 nova_compute[187252]: 2025-11-28 16:27:02.217 187256 DEBUG nova.compute.manager [req-80401c2d-e03e-40ad-9d01-2401faa435e1 req-62487b5f-f676-40d5-bb9e-fe6544c04e92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Refreshing instance network info cache due to event network-changed-af92c5bb-bdd8-47de-9e20-776300662aa0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 11:27:02 np0005538960 nova_compute[187252]: 2025-11-28 16:27:02.217 187256 DEBUG oslo_concurrency.lockutils [req-80401c2d-e03e-40ad-9d01-2401faa435e1 req-62487b5f-f676-40d5-bb9e-fe6544c04e92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 11:27:02 np0005538960 nova_compute[187252]: 2025-11-28 16:27:02.217 187256 DEBUG oslo_concurrency.lockutils [req-80401c2d-e03e-40ad-9d01-2401faa435e1 req-62487b5f-f676-40d5-bb9e-fe6544c04e92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 11:27:02 np0005538960 nova_compute[187252]: 2025-11-28 16:27:02.217 187256 DEBUG nova.network.neutron [req-80401c2d-e03e-40ad-9d01-2401faa435e1 req-62487b5f-f676-40d5-bb9e-fe6544c04e92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Refreshing network info cache for port af92c5bb-bdd8-47de-9e20-776300662aa0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 11:27:02 np0005538960 nova_compute[187252]: 2025-11-28 16:27:02.998 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:27:03 np0005538960 nova_compute[187252]: 2025-11-28 16:27:03.889 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:27:03 np0005538960 nova_compute[187252]: 2025-11-28 16:27:03.943 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:27:04 np0005538960 nova_compute[187252]: 2025-11-28 16:27:04.461 187256 DEBUG nova.network.neutron [req-80401c2d-e03e-40ad-9d01-2401faa435e1 req-62487b5f-f676-40d5-bb9e-fe6544c04e92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Updated VIF entry in instance network info cache for port af92c5bb-bdd8-47de-9e20-776300662aa0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 11:27:04 np0005538960 nova_compute[187252]: 2025-11-28 16:27:04.462 187256 DEBUG nova.network.neutron [req-80401c2d-e03e-40ad-9d01-2401faa435e1 req-62487b5f-f676-40d5-bb9e-fe6544c04e92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Updating instance_info_cache with network_info: [{"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 11:27:04 np0005538960 nova_compute[187252]: 2025-11-28 16:27:04.484 187256 DEBUG oslo_concurrency.lockutils [req-80401c2d-e03e-40ad-9d01-2401faa435e1 req-62487b5f-f676-40d5-bb9e-fe6544c04e92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:27:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:06.347 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:06.347 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:06.348 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:06 np0005538960 podman[219168]: 2025-11-28 16:27:06.467307713 +0000 UTC m=+0.089729418 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 11:27:08 np0005538960 nova_compute[187252]: 2025-11-28 16:27:08.000 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:08 np0005538960 nova_compute[187252]: 2025-11-28 16:27:08.946 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:09.571 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:27:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:09.573 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:27:09 np0005538960 nova_compute[187252]: 2025-11-28 16:27:09.573 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:10 np0005538960 podman[219196]: 2025-11-28 16:27:10.178027978 +0000 UTC m=+0.067482679 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent)
Nov 28 11:27:10 np0005538960 podman[219195]: 2025-11-28 16:27:10.178387466 +0000 UTC m=+0.070256842 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:27:11 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:11Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a9:48:99 10.100.0.11
Nov 28 11:27:11 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:11Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a9:48:99 10.100.0.11
Nov 28 11:27:13 np0005538960 nova_compute[187252]: 2025-11-28 16:27:13.004 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:13 np0005538960 nova_compute[187252]: 2025-11-28 16:27:13.081 187256 DEBUG nova.compute.manager [req-e2a2d402-5009-40b8-8199-77a856fe5eb7 req-d739539a-fd5b-482d-b485-a7c66856fea2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-changed-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:13 np0005538960 nova_compute[187252]: 2025-11-28 16:27:13.082 187256 DEBUG nova.compute.manager [req-e2a2d402-5009-40b8-8199-77a856fe5eb7 req-d739539a-fd5b-482d-b485-a7c66856fea2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing instance network info cache due to event network-changed-c5a5eead-793d-43f8-8cc3-a792eb3d80f8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:27:13 np0005538960 nova_compute[187252]: 2025-11-28 16:27:13.082 187256 DEBUG oslo_concurrency.lockutils [req-e2a2d402-5009-40b8-8199-77a856fe5eb7 req-d739539a-fd5b-482d-b485-a7c66856fea2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:27:13 np0005538960 nova_compute[187252]: 2025-11-28 16:27:13.082 187256 DEBUG oslo_concurrency.lockutils [req-e2a2d402-5009-40b8-8199-77a856fe5eb7 req-d739539a-fd5b-482d-b485-a7c66856fea2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:27:13 np0005538960 nova_compute[187252]: 2025-11-28 16:27:13.082 187256 DEBUG nova.network.neutron [req-e2a2d402-5009-40b8-8199-77a856fe5eb7 req-d739539a-fd5b-482d-b485-a7c66856fea2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing network info cache for port c5a5eead-793d-43f8-8cc3-a792eb3d80f8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:27:13 np0005538960 nova_compute[187252]: 2025-11-28 16:27:13.949 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:15 np0005538960 podman[219250]: 2025-11-28 16:27:15.152715822 +0000 UTC m=+0.052188273 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:27:15 np0005538960 nova_compute[187252]: 2025-11-28 16:27:15.363 187256 DEBUG nova.network.neutron [req-e2a2d402-5009-40b8-8199-77a856fe5eb7 req-d739539a-fd5b-482d-b485-a7c66856fea2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updated VIF entry in instance network info cache for port c5a5eead-793d-43f8-8cc3-a792eb3d80f8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:27:15 np0005538960 nova_compute[187252]: 2025-11-28 16:27:15.363 187256 DEBUG nova.network.neutron [req-e2a2d402-5009-40b8-8199-77a856fe5eb7 req-d739539a-fd5b-482d-b485-a7c66856fea2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:27:15 np0005538960 nova_compute[187252]: 2025-11-28 16:27:15.383 187256 DEBUG oslo_concurrency.lockutils [req-e2a2d402-5009-40b8-8199-77a856fe5eb7 req-d739539a-fd5b-482d-b485-a7c66856fea2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:27:16 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:16.575 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:27:18 np0005538960 nova_compute[187252]: 2025-11-28 16:27:18.007 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:18 np0005538960 nova_compute[187252]: 2025-11-28 16:27:18.953 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:19 np0005538960 podman[219276]: 2025-11-28 16:27:19.174482757 +0000 UTC m=+0.067686715 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=)
Nov 28 11:27:19 np0005538960 nova_compute[187252]: 2025-11-28 16:27:19.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:27:19 np0005538960 nova_compute[187252]: 2025-11-28 16:27:19.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.621 187256 DEBUG oslo_concurrency.lockutils [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.622 187256 DEBUG oslo_concurrency.lockutils [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.622 187256 DEBUG oslo_concurrency.lockutils [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.623 187256 DEBUG oslo_concurrency.lockutils [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.623 187256 DEBUG oslo_concurrency.lockutils [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.624 187256 INFO nova.compute.manager [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Terminating instance#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.625 187256 DEBUG nova.compute.manager [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:27:20 np0005538960 kernel: tapaf92c5bb-bd (unregistering): left promiscuous mode
Nov 28 11:27:20 np0005538960 NetworkManager[55548]: <info>  [1764347240.6561] device (tapaf92c5bb-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:27:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:20Z|00135|binding|INFO|Releasing lport af92c5bb-bdd8-47de-9e20-776300662aa0 from this chassis (sb_readonly=0)
Nov 28 11:27:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:20Z|00136|binding|INFO|Setting lport af92c5bb-bdd8-47de-9e20-776300662aa0 down in Southbound
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.665 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:20Z|00137|binding|INFO|Removing iface tapaf92c5bb-bd ovn-installed in OVS
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.667 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:20.680 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a9:48:99 10.100.0.11'], port_security=['fa:16:3e:a9:48:99 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4a3166d7-3401-4240-94f1-f56e885f648e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42968652-da0d-4f84-a781-24c009bf7324', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd295b2f9-6c9c-4736-955f-6e49c8f210cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f9bcb89-9d40-426c-aef2-9e1f3ea8fc79, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=af92c5bb-bdd8-47de-9e20-776300662aa0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:27:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:20.681 104369 INFO neutron.agent.ovn.metadata.agent [-] Port af92c5bb-bdd8-47de-9e20-776300662aa0 in datapath 42968652-da0d-4f84-a781-24c009bf7324 unbound from our chassis#033[00m
Nov 28 11:27:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:20.683 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42968652-da0d-4f84-a781-24c009bf7324, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:27:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:20.684 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f93bfc0e-2523-4c55-ac6c-d7ab4c824ef8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:20.686 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-42968652-da0d-4f84-a781-24c009bf7324 namespace which is not needed anymore#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.687 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 kernel: tap4a8384e7-52 (unregistering): left promiscuous mode
Nov 28 11:27:20 np0005538960 NetworkManager[55548]: <info>  [1764347240.7037] device (tap4a8384e7-52): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.722 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:20Z|00138|binding|INFO|Releasing lport 4a8384e7-5273-441f-aa76-37047262d85e from this chassis (sb_readonly=0)
Nov 28 11:27:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:20Z|00139|binding|INFO|Setting lport 4a8384e7-5273-441f-aa76-37047262d85e down in Southbound
Nov 28 11:27:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:20Z|00140|binding|INFO|Removing iface tap4a8384e7-52 ovn-installed in OVS
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.724 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.739 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Nov 28 11:27:20 np0005538960 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000001e.scope: Consumed 14.186s CPU time.
Nov 28 11:27:20 np0005538960 systemd-machined[153518]: Machine qemu-10-instance-0000001e terminated.
Nov 28 11:27:20 np0005538960 neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324[219039]: [NOTICE]   (219043) : haproxy version is 2.8.14-c23fe91
Nov 28 11:27:20 np0005538960 neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324[219039]: [NOTICE]   (219043) : path to executable is /usr/sbin/haproxy
Nov 28 11:27:20 np0005538960 neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324[219039]: [WARNING]  (219043) : Exiting Master process...
Nov 28 11:27:20 np0005538960 neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324[219039]: [WARNING]  (219043) : Exiting Master process...
Nov 28 11:27:20 np0005538960 neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324[219039]: [ALERT]    (219043) : Current worker (219045) exited with code 143 (Terminated)
Nov 28 11:27:20 np0005538960 neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324[219039]: [WARNING]  (219043) : All workers exited. Exiting... (0)
Nov 28 11:27:20 np0005538960 systemd[1]: libpod-53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525.scope: Deactivated successfully.
Nov 28 11:27:20 np0005538960 podman[219326]: 2025-11-28 16:27:20.880835445 +0000 UTC m=+0.060134644 container died 53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:27:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:20.889 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:26:44:f0 2001:db8::f816:3eff:fe26:44f0'], port_security=['fa:16:3e:26:44:f0 2001:db8::f816:3eff:fe26:44f0'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe26:44f0/64', 'neutron:device_id': '4a3166d7-3401-4240-94f1-f56e885f648e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3052edd0-a739-4b9a-8e70-10ed6d70aba3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd295b2f9-6c9c-4736-955f-6e49c8f210cd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=18498b44-2b4d-4a21-814c-0acc5cfb3b2f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=4a8384e7-5273-441f-aa76-37047262d85e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.915 187256 INFO nova.virt.libvirt.driver [-] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Instance destroyed successfully.#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.916 187256 DEBUG nova.objects.instance [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lazy-loading 'resources' on Instance uuid 4a3166d7-3401-4240-94f1-f56e885f648e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:27:20 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525-userdata-shm.mount: Deactivated successfully.
Nov 28 11:27:20 np0005538960 systemd[1]: var-lib-containers-storage-overlay-1709eee38be9cde8fdc3f6994f55f7a74b8032f7b649228be5848b642ef6324c-merged.mount: Deactivated successfully.
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.930 187256 DEBUG nova.virt.libvirt.vif [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:26:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1299525305',display_name='tempest-TestGettingAddress-server-1299525305',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1299525305',id=30,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG25bqRXMr3FU4IFmiWP4AKWa73QJj7Qe460t1LnUyUyJdk0XAT/L8jqfNUYoWQJfX3cLeLdAp3ZQ2H2sXeDrQd/UcTEwydPrQ/zJuzyzXV6hqrU4M+wB1dzDG7carbtrg==',key_name='tempest-TestGettingAddress-1714235428',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:26:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-ilkyg4d6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:26:58Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=4a3166d7-3401-4240-94f1-f56e885f648e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.930 187256 DEBUG nova.network.os_vif_util [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "af92c5bb-bdd8-47de-9e20-776300662aa0", "address": "fa:16:3e:a9:48:99", "network": {"id": "42968652-da0d-4f84-a781-24c009bf7324", "bridge": "br-int", "label": "tempest-network-smoke--36855790", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.242", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaf92c5bb-bd", "ovs_interfaceid": "af92c5bb-bdd8-47de-9e20-776300662aa0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.931 187256 DEBUG nova.network.os_vif_util [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a9:48:99,bridge_name='br-int',has_traffic_filtering=True,id=af92c5bb-bdd8-47de-9e20-776300662aa0,network=Network(42968652-da0d-4f84-a781-24c009bf7324),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf92c5bb-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.931 187256 DEBUG os_vif [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:48:99,bridge_name='br-int',has_traffic_filtering=True,id=af92c5bb-bdd8-47de-9e20-776300662aa0,network=Network(42968652-da0d-4f84-a781-24c009bf7324),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf92c5bb-bd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.933 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.933 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf92c5bb-bd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:27:20 np0005538960 podman[219326]: 2025-11-28 16:27:20.934860279 +0000 UTC m=+0.114159468 container cleanup 53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.936 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.939 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.941 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 systemd[1]: libpod-conmon-53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525.scope: Deactivated successfully.
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.946 187256 INFO os_vif [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a9:48:99,bridge_name='br-int',has_traffic_filtering=True,id=af92c5bb-bdd8-47de-9e20-776300662aa0,network=Network(42968652-da0d-4f84-a781-24c009bf7324),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaf92c5bb-bd')#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.947 187256 DEBUG nova.virt.libvirt.vif [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:26:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1299525305',display_name='tempest-TestGettingAddress-server-1299525305',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1299525305',id=30,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG25bqRXMr3FU4IFmiWP4AKWa73QJj7Qe460t1LnUyUyJdk0XAT/L8jqfNUYoWQJfX3cLeLdAp3ZQ2H2sXeDrQd/UcTEwydPrQ/zJuzyzXV6hqrU4M+wB1dzDG7carbtrg==',key_name='tempest-TestGettingAddress-1714235428',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:26:58Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-ilkyg4d6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:26:58Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=4a3166d7-3401-4240-94f1-f56e885f648e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.947 187256 DEBUG nova.network.os_vif_util [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.948 187256 DEBUG nova.network.os_vif_util [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:26:44:f0,bridge_name='br-int',has_traffic_filtering=True,id=4a8384e7-5273-441f-aa76-37047262d85e,network=Network(3052edd0-a739-4b9a-8e70-10ed6d70aba3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a8384e7-52') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.949 187256 DEBUG os_vif [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:44:f0,bridge_name='br-int',has_traffic_filtering=True,id=4a8384e7-5273-441f-aa76-37047262d85e,network=Network(3052edd0-a739-4b9a-8e70-10ed6d70aba3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a8384e7-52') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.950 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.950 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4a8384e7-52, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.952 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.953 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.953 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.955 187256 INFO os_vif [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:26:44:f0,bridge_name='br-int',has_traffic_filtering=True,id=4a8384e7-5273-441f-aa76-37047262d85e,network=Network(3052edd0-a739-4b9a-8e70-10ed6d70aba3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4a8384e7-52')#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.955 187256 INFO nova.virt.libvirt.driver [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Deleting instance files /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e_del#033[00m
Nov 28 11:27:20 np0005538960 nova_compute[187252]: 2025-11-28 16:27:20.956 187256 INFO nova.virt.libvirt.driver [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Deletion of /var/lib/nova/instances/4a3166d7-3401-4240-94f1-f56e885f648e_del complete#033[00m
Nov 28 11:27:21 np0005538960 podman[219384]: 2025-11-28 16:27:21.011897315 +0000 UTC m=+0.050288301 container remove 53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.017 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[0caa68fd-7ad5-4a2e-b5d2-e01fd87524ba]: (4, ('Fri Nov 28 04:27:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324 (53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525)\n53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525\nFri Nov 28 04:27:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-42968652-da0d-4f84-a781-24c009bf7324 (53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525)\n53dd2ae3d030565effc8b60e44cd5a8d8ab30911f59405d0bbe31e95130c4525\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.020 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3cee21cd-fcdc-4ab6-90bd-2b5a1436eda9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.021 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42968652-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.023 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:21 np0005538960 kernel: tap42968652-d0: left promiscuous mode
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.035 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.039 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b05265e6-601d-4826-bb37-fa0fc7bb1f81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.059 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9364d268-c999-47ce-813a-69083aba8206]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.061 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[147293f1-5c11-46ab-9fa0-8370f3f0b699]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.080 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[85de96c5-c19d-4ac1-a6fc-de4d296b0d51]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431467, 'reachable_time': 42696, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219400, 'error': None, 'target': 'ovnmeta-42968652-da0d-4f84-a781-24c009bf7324', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 systemd[1]: run-netns-ovnmeta\x2d42968652\x2dda0d\x2d4f84\x2da781\x2d24c009bf7324.mount: Deactivated successfully.
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.086 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-42968652-da0d-4f84-a781-24c009bf7324 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.086 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[de723200-32ef-4b32-bac9-bbae260f4165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.087 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 4a8384e7-5273-441f-aa76-37047262d85e in datapath 3052edd0-a739-4b9a-8e70-10ed6d70aba3 unbound from our chassis#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.089 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3052edd0-a739-4b9a-8e70-10ed6d70aba3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.090 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[89fdf621-f212-42ca-8779-82a487146764]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.090 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3 namespace which is not needed anymore#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.198 187256 INFO nova.compute.manager [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Took 0.57 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.199 187256 DEBUG oslo.service.loopingcall [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.199 187256 DEBUG nova.compute.manager [-] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.200 187256 DEBUG nova.network.neutron [-] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:27:21 np0005538960 neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3[219113]: [NOTICE]   (219126) : haproxy version is 2.8.14-c23fe91
Nov 28 11:27:21 np0005538960 neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3[219113]: [NOTICE]   (219126) : path to executable is /usr/sbin/haproxy
Nov 28 11:27:21 np0005538960 neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3[219113]: [WARNING]  (219126) : Exiting Master process...
Nov 28 11:27:21 np0005538960 neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3[219113]: [ALERT]    (219126) : Current worker (219133) exited with code 143 (Terminated)
Nov 28 11:27:21 np0005538960 neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3[219113]: [WARNING]  (219126) : All workers exited. Exiting... (0)
Nov 28 11:27:21 np0005538960 systemd[1]: libpod-0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7.scope: Deactivated successfully.
Nov 28 11:27:21 np0005538960 podman[219418]: 2025-11-28 16:27:21.230139959 +0000 UTC m=+0.053116064 container died 0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:27:21 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7-userdata-shm.mount: Deactivated successfully.
Nov 28 11:27:21 np0005538960 systemd[1]: var-lib-containers-storage-overlay-f12ee0dd95d9ff24c2915c0121b82d91a1a835366d8be1bddbb76f07020758f7-merged.mount: Deactivated successfully.
Nov 28 11:27:21 np0005538960 podman[219418]: 2025-11-28 16:27:21.264981018 +0000 UTC m=+0.087957123 container cleanup 0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 11:27:21 np0005538960 systemd[1]: libpod-conmon-0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7.scope: Deactivated successfully.
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:27:21 np0005538960 podman[219446]: 2025-11-28 16:27:21.332758554 +0000 UTC m=+0.044446919 container remove 0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.338 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[41bf12b4-e24a-4132-9e80-bb96d02aeab9]: (4, ('Fri Nov 28 04:27:21 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3 (0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7)\n0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7\nFri Nov 28 04:27:21 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3 (0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7)\n0abd5a49355e2f9cb7db1fa8952eb93e896629189dad63bb470e2decb42144d7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.340 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[4748c183-6f91-4855-86ad-ed96c85e06f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.341 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3052edd0-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.343 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:21 np0005538960 kernel: tap3052edd0-a0: left promiscuous mode
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.348 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.349 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.349 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.349 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.354 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.361 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[fa24a234-3c8c-4e0c-83f9-6a0292a80aed]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.373 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[68cad837-26da-4824-b2ff-fbb160325f70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.374 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c0af5cb8-a911-44c8-a97a-7403036753f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.391 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[511e3194-7bc3-4da0-88ca-4755cba1840f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 431561, 'reachable_time': 43687, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219462, 'error': None, 'target': 'ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.393 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3052edd0-a739-4b9a-8e70-10ed6d70aba3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:27:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:21.394 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[3bcc7d07-619c-491f-9dcc-cb954cae8de1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.442 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.505 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.506 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.604 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.814 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.816 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5565MB free_disk=73.30842971801758GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.817 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.818 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.912 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance ecbea330-ccac-4a01-a80b-0c10a2f686e2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.913 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance 4a3166d7-3401-4240-94f1-f56e885f648e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.913 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.914 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:27:21 np0005538960 systemd[1]: run-netns-ovnmeta\x2d3052edd0\x2da739\x2d4b9a\x2d8e70\x2d10ed6d70aba3.mount: Deactivated successfully.
Nov 28 11:27:21 np0005538960 nova_compute[187252]: 2025-11-28 16:27:21.992 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.006 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.024 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.024 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.396 187256 DEBUG nova.compute.manager [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-changed-af92c5bb-bdd8-47de-9e20-776300662aa0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.396 187256 DEBUG nova.compute.manager [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Refreshing instance network info cache due to event network-changed-af92c5bb-bdd8-47de-9e20-776300662aa0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.397 187256 DEBUG oslo_concurrency.lockutils [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.397 187256 DEBUG oslo_concurrency.lockutils [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.397 187256 DEBUG nova.network.neutron [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Refreshing network info cache for port af92c5bb-bdd8-47de-9e20-776300662aa0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.426 187256 DEBUG nova.compute.manager [req-fd8696ea-ad6c-4d72-9636-bb0a2a1a9fff req-1e143169-9d0b-48fd-ac18-a9654fa95d70 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-deleted-af92c5bb-bdd8-47de-9e20-776300662aa0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.426 187256 INFO nova.compute.manager [req-fd8696ea-ad6c-4d72-9636-bb0a2a1a9fff req-1e143169-9d0b-48fd-ac18-a9654fa95d70 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Neutron deleted interface af92c5bb-bdd8-47de-9e20-776300662aa0; detaching it from the instance and deleting it from the info cache#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.427 187256 DEBUG nova.network.neutron [req-fd8696ea-ad6c-4d72-9636-bb0a2a1a9fff req-1e143169-9d0b-48fd-ac18-a9654fa95d70 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Updating instance_info_cache with network_info: [{"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.451 187256 DEBUG nova.compute.manager [req-fd8696ea-ad6c-4d72-9636-bb0a2a1a9fff req-1e143169-9d0b-48fd-ac18-a9654fa95d70 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Detach interface failed, port_id=af92c5bb-bdd8-47de-9e20-776300662aa0, reason: Instance 4a3166d7-3401-4240-94f1-f56e885f648e could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.711 187256 INFO nova.network.neutron [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Port af92c5bb-bdd8-47de-9e20-776300662aa0 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.712 187256 DEBUG nova.network.neutron [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Updating instance_info_cache with network_info: [{"id": "4a8384e7-5273-441f-aa76-37047262d85e", "address": "fa:16:3e:26:44:f0", "network": {"id": "3052edd0-a739-4b9a-8e70-10ed6d70aba3", "bridge": "br-int", "label": "tempest-network-smoke--813939374", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe26:44f0", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4a8384e7-52", "ovs_interfaceid": "4a8384e7-5273-441f-aa76-37047262d85e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.733 187256 DEBUG oslo_concurrency.lockutils [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-4a3166d7-3401-4240-94f1-f56e885f648e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.734 187256 DEBUG nova.compute.manager [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-unplugged-af92c5bb-bdd8-47de-9e20-776300662aa0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.734 187256 DEBUG oslo_concurrency.lockutils [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.735 187256 DEBUG oslo_concurrency.lockutils [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.735 187256 DEBUG oslo_concurrency.lockutils [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.735 187256 DEBUG nova.compute.manager [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] No waiting events found dispatching network-vif-unplugged-af92c5bb-bdd8-47de-9e20-776300662aa0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.736 187256 DEBUG nova.compute.manager [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-unplugged-af92c5bb-bdd8-47de-9e20-776300662aa0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.736 187256 DEBUG nova.compute.manager [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-plugged-af92c5bb-bdd8-47de-9e20-776300662aa0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.736 187256 DEBUG oslo_concurrency.lockutils [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.737 187256 DEBUG oslo_concurrency.lockutils [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.737 187256 DEBUG oslo_concurrency.lockutils [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.738 187256 DEBUG nova.compute.manager [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] No waiting events found dispatching network-vif-plugged-af92c5bb-bdd8-47de-9e20-776300662aa0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.738 187256 WARNING nova.compute.manager [req-8d3a1034-bfd3-4761-ac69-3050ce6c041f req-c9c26c3a-3699-454a-b9b4-b917fb53e82e 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received unexpected event network-vif-plugged-af92c5bb-bdd8-47de-9e20-776300662aa0 for instance with vm_state active and task_state deleting.#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.861 187256 DEBUG nova.network.neutron [-] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.877 187256 INFO nova.compute.manager [-] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Took 1.68 seconds to deallocate network for instance.#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.927 187256 DEBUG oslo_concurrency.lockutils [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:22 np0005538960 nova_compute[187252]: 2025-11-28 16:27:22.928 187256 DEBUG oslo_concurrency.lockutils [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.004 187256 DEBUG nova.compute.provider_tree [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.010 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.019 187256 DEBUG nova.scheduler.client.report [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.023 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.023 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.024 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.047 187256 DEBUG oslo_concurrency.lockutils [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.075 187256 INFO nova.scheduler.client.report [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Deleted allocations for instance 4a3166d7-3401-4240-94f1-f56e885f648e#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.165 187256 DEBUG oslo_concurrency.lockutils [None req-bfbac4d2-4f9e-49c8-a65b-f4f1c6fcd466 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.544s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.748 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.748 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.749 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:27:23 np0005538960 nova_compute[187252]: 2025-11-28 16:27:23.749 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ecbea330-ccac-4a01-a80b-0c10a2f686e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.532 187256 DEBUG nova.compute.manager [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-unplugged-4a8384e7-5273-441f-aa76-37047262d85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.532 187256 DEBUG oslo_concurrency.lockutils [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.532 187256 DEBUG oslo_concurrency.lockutils [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.532 187256 DEBUG oslo_concurrency.lockutils [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.533 187256 DEBUG nova.compute.manager [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] No waiting events found dispatching network-vif-unplugged-4a8384e7-5273-441f-aa76-37047262d85e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.533 187256 WARNING nova.compute.manager [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received unexpected event network-vif-unplugged-4a8384e7-5273-441f-aa76-37047262d85e for instance with vm_state deleted and task_state None.#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.533 187256 DEBUG nova.compute.manager [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-plugged-4a8384e7-5273-441f-aa76-37047262d85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.533 187256 DEBUG oslo_concurrency.lockutils [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.533 187256 DEBUG oslo_concurrency.lockutils [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.534 187256 DEBUG oslo_concurrency.lockutils [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "4a3166d7-3401-4240-94f1-f56e885f648e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.534 187256 DEBUG nova.compute.manager [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] No waiting events found dispatching network-vif-plugged-4a8384e7-5273-441f-aa76-37047262d85e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.534 187256 WARNING nova.compute.manager [req-6c171653-7330-45d6-a653-ffa5843204e5 req-d12669d1-7858-4c9c-8f6e-2fd1a2f62fa6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received unexpected event network-vif-plugged-4a8384e7-5273-441f-aa76-37047262d85e for instance with vm_state deleted and task_state None.#033[00m
Nov 28 11:27:24 np0005538960 nova_compute[187252]: 2025-11-28 16:27:24.545 187256 DEBUG nova.compute.manager [req-f87335f1-9138-474f-81b3-1fbf6c987cba req-4bb0115e-20e7-49bc-9f8e-362a397a620f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Received event network-vif-deleted-4a8384e7-5273-441f-aa76-37047262d85e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:25 np0005538960 nova_compute[187252]: 2025-11-28 16:27:25.953 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:26 np0005538960 nova_compute[187252]: 2025-11-28 16:27:26.730 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:27:26 np0005538960 nova_compute[187252]: 2025-11-28 16:27:26.749 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:27:26 np0005538960 nova_compute[187252]: 2025-11-28 16:27:26.750 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 11:27:26 np0005538960 nova_compute[187252]: 2025-11-28 16:27:26.750 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:27:26 np0005538960 nova_compute[187252]: 2025-11-28 16:27:26.751 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:27:26 np0005538960 nova_compute[187252]: 2025-11-28 16:27:26.751 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:27:26 np0005538960 nova_compute[187252]: 2025-11-28 16:27:26.751 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:27:27 np0005538960 nova_compute[187252]: 2025-11-28 16:27:27.038 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:27:27 np0005538960 podman[219469]: 2025-11-28 16:27:27.172042925 +0000 UTC m=+0.064049741 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Nov 28 11:27:27 np0005538960 nova_compute[187252]: 2025-11-28 16:27:27.309 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:27:28 np0005538960 nova_compute[187252]: 2025-11-28 16:27:28.013 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:30 np0005538960 nova_compute[187252]: 2025-11-28 16:27:30.957 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:31 np0005538960 podman[219490]: 2025-11-28 16:27:31.161931898 +0000 UTC m=+0.053522353 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:27:33 np0005538960 nova_compute[187252]: 2025-11-28 16:27:33.015 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:35 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:35Z|00141|binding|INFO|Releasing lport 13f43c4e-df41-48fa-9d27-34cbd88f8c17 from this chassis (sb_readonly=0)
Nov 28 11:27:35 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:35Z|00142|binding|INFO|Releasing lport 065b851d-69a4-49d0-a066-f5c141f99961 from this chassis (sb_readonly=0)
Nov 28 11:27:35 np0005538960 nova_compute[187252]: 2025-11-28 16:27:35.515 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:35 np0005538960 nova_compute[187252]: 2025-11-28 16:27:35.914 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347240.912661, 4a3166d7-3401-4240-94f1-f56e885f648e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:27:35 np0005538960 nova_compute[187252]: 2025-11-28 16:27:35.914 187256 INFO nova.compute.manager [-] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:27:35 np0005538960 nova_compute[187252]: 2025-11-28 16:27:35.936 187256 DEBUG nova.compute.manager [None req-df43e007-2890-4b6d-8f57-0f7cf6a34f08 - - - - - -] [instance: 4a3166d7-3401-4240-94f1-f56e885f648e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:27:35 np0005538960 nova_compute[187252]: 2025-11-28 16:27:35.958 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:37 np0005538960 podman[219518]: 2025-11-28 16:27:37.218807539 +0000 UTC m=+0.116281915 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251125)
Nov 28 11:27:38 np0005538960 nova_compute[187252]: 2025-11-28 16:27:38.018 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:40 np0005538960 nova_compute[187252]: 2025-11-28 16:27:40.960 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:41 np0005538960 podman[219546]: 2025-11-28 16:27:41.161705577 +0000 UTC m=+0.055154690 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 11:27:41 np0005538960 podman[219545]: 2025-11-28 16:27:41.18786697 +0000 UTC m=+0.074977350 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 11:27:41 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:41Z|00143|binding|INFO|Releasing lport 13f43c4e-df41-48fa-9d27-34cbd88f8c17 from this chassis (sb_readonly=0)
Nov 28 11:27:41 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:41Z|00144|binding|INFO|Releasing lport 065b851d-69a4-49d0-a066-f5c141f99961 from this chassis (sb_readonly=0)
Nov 28 11:27:41 np0005538960 nova_compute[187252]: 2025-11-28 16:27:41.967 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:43 np0005538960 nova_compute[187252]: 2025-11-28 16:27:43.029 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:45 np0005538960 nova_compute[187252]: 2025-11-28 16:27:45.964 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:46 np0005538960 podman[219584]: 2025-11-28 16:27:46.148952045 +0000 UTC m=+0.057912043 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.243 187256 DEBUG oslo_concurrency.lockutils [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "interface-ecbea330-ccac-4a01-a80b-0c10a2f686e2-c5a5eead-793d-43f8-8cc3-a792eb3d80f8" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.244 187256 DEBUG oslo_concurrency.lockutils [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "interface-ecbea330-ccac-4a01-a80b-0c10a2f686e2-c5a5eead-793d-43f8-8cc3-a792eb3d80f8" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.271 187256 DEBUG nova.objects.instance [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'flavor' on Instance uuid ecbea330-ccac-4a01-a80b-0c10a2f686e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.290 187256 DEBUG nova.virt.libvirt.vif [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1122100469',display_name='tempest-TestNetworkBasicOps-server-1122100469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1122100469',id=26,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIlPWyvkROh7MBxD8hU490Sb9a2OSX3b0N0u427OSfQDYupob+Q87e0mTGtZ03o9uU2OcrEOzyX3GaZpnMoGT/Lwyo3imGuadY4jiKIo2URn+d5N+y/vPBH3pm/LOkXk6Q==',key_name='tempest-TestNetworkBasicOps-1436030712',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-i3jyimxo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:25:55Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=ecbea330-ccac-4a01-a80b-0c10a2f686e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.291 187256 DEBUG nova.network.os_vif_util [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.292 187256 DEBUG nova.network.os_vif_util [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.294 187256 DEBUG nova.virt.libvirt.guest [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:6a:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc5a5eead-79"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.296 187256 DEBUG nova.virt.libvirt.guest [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:6a:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc5a5eead-79"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.298 187256 DEBUG nova.virt.libvirt.driver [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Attempting to detach device tapc5a5eead-79 from instance ecbea330-ccac-4a01-a80b-0c10a2f686e2 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.299 187256 DEBUG nova.virt.libvirt.guest [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] detach device xml: <interface type="ethernet">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <mac address="fa:16:3e:c0:6a:b5"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <model type="virtio"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <mtu size="1442"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <target dev="tapc5a5eead-79"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: </interface>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.317 187256 DEBUG nova.virt.libvirt.guest [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:6a:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc5a5eead-79"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.321 187256 DEBUG nova.virt.libvirt.guest [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c0:6a:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc5a5eead-79"/></interface>not found in domain: <domain type='kvm' id='9'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <name>instance-0000001a</name>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <uuid>ecbea330-ccac-4a01-a80b-0c10a2f686e2</uuid>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:name>tempest-TestNetworkBasicOps-server-1122100469</nova:name>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:creationTime>2025-11-28 16:26:31</nova:creationTime>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:flavor name="m1.nano">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:memory>128</nova:memory>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:disk>1</nova:disk>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:swap>0</nova:swap>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:vcpus>1</nova:vcpus>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </nova:flavor>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:owner>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </nova:owner>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:ports>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:port uuid="ccc62a21-60d5-4151-8ab5-c33149100cd0">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </nova:port>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:port uuid="c5a5eead-793d-43f8-8cc3-a792eb3d80f8">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </nova:port>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </nova:ports>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: </nova:instance>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <memory unit='KiB'>131072</memory>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <vcpu placement='static'>1</vcpu>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <resource>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <partition>/machine</partition>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </resource>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <sysinfo type='smbios'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='manufacturer'>RDO</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='product'>OpenStack Compute</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='serial'>ecbea330-ccac-4a01-a80b-0c10a2f686e2</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='uuid'>ecbea330-ccac-4a01-a80b-0c10a2f686e2</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='family'>Virtual Machine</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <boot dev='hd'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <smbios mode='sysinfo'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <vmcoreinfo state='on'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <cpu mode='custom' match='exact' check='full'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <model fallback='forbid'>Nehalem</model>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <feature policy='require' name='x2apic'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <feature policy='require' name='hypervisor'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <feature policy='require' name='vme'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <clock offset='utc'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <timer name='pit' tickpolicy='delay'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <timer name='hpet' present='no'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <on_poweroff>destroy</on_poweroff>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <on_reboot>restart</on_reboot>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <on_crash>destroy</on_crash>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <disk type='file' device='disk'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <source file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk' index='2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <backingStore type='file' index='3'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:        <format type='raw'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:        <source file='/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:        <backingStore/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      </backingStore>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target dev='vda' bus='virtio'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='virtio-disk0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <disk type='file' device='cdrom'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <driver name='qemu' type='raw' cache='none'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <source file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.config' index='1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <backingStore/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target dev='sda' bus='sata'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <readonly/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='sata0-0-0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='0' model='pcie-root'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pcie.0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='1' port='0x10'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='2' port='0x11'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='3' port='0x12'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.3'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='4' port='0x13'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.4'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='5' port='0x14'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.5'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='6' port='0x15'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.6'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='7' port='0x16'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.7'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='8' port='0x17'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.8'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='9' port='0x18'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.9'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='10' port='0x19'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.10'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='11' port='0x1a'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.11'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='12' port='0x1b'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.12'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='13' port='0x1c'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.13'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='14' port='0x1d'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.14'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='15' port='0x1e'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.15'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='16' port='0x1f'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.16'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='17' port='0x20'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.17'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='18' port='0x21'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.18'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='19' port='0x22'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.19'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='20' port='0x23'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.20'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='21' port='0x24'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.21'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='22' port='0x25'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.22'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='23' port='0x26'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.23'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='24' port='0x27'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.24'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='25' port='0x28'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.25'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-pci-bridge'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.26'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='usb'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='sata' index='0'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='ide'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <interface type='ethernet'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <mac address='fa:16:3e:a3:d6:a7'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target dev='tapccc62a21-60'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model type='virtio'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <driver name='vhost' rx_queue_size='512'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <mtu size='1442'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='net0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <interface type='ethernet'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <mac address='fa:16:3e:c0:6a:b5'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target dev='tapc5a5eead-79'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model type='virtio'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <driver name='vhost' rx_queue_size='512'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <mtu size='1442'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='net1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <serial type='pty'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <source path='/dev/pts/0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <log file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/console.log' append='off'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target type='isa-serial' port='0'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:        <model name='isa-serial'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      </target>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='serial0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <console type='pty' tty='/dev/pts/0'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <source path='/dev/pts/0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <log file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/console.log' append='off'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target type='serial' port='0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='serial0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </console>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <input type='tablet' bus='usb'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='input0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='usb' bus='0' port='1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <input type='mouse' bus='ps2'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='input1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <input type='keyboard' bus='ps2'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='input2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <listen type='address' address='::0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </graphics>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <audio id='1' type='none'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model type='virtio' heads='1' primary='yes'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='video0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <watchdog model='itco' action='reset'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='watchdog0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </watchdog>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <memballoon model='virtio'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <stats period='10'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='balloon0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <rng model='virtio'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <backend model='random'>/dev/urandom</backend>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='rng0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <label>system_u:system_r:svirt_t:s0:c563,c890</label>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c563,c890</imagelabel>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </seclabel>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <label>+107:+107</label>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <imagelabel>+107:+107</imagelabel>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </seclabel>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.322 187256 INFO nova.virt.libvirt.driver [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully detached device tapc5a5eead-79 from instance ecbea330-ccac-4a01-a80b-0c10a2f686e2 from the persistent domain config.#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.322 187256 DEBUG nova.virt.libvirt.driver [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] (1/8): Attempting to detach device tapc5a5eead-79 with device alias net1 from instance ecbea330-ccac-4a01-a80b-0c10a2f686e2 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.322 187256 DEBUG nova.virt.libvirt.guest [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] detach device xml: <interface type="ethernet">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <mac address="fa:16:3e:c0:6a:b5"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <model type="virtio"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <mtu size="1442"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <target dev="tapc5a5eead-79"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: </interface>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Nov 28 11:27:46 np0005538960 kernel: tapc5a5eead-79 (unregistering): left promiscuous mode
Nov 28 11:27:46 np0005538960 NetworkManager[55548]: <info>  [1764347266.4300] device (tapc5a5eead-79): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:27:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:46Z|00145|binding|INFO|Releasing lport c5a5eead-793d-43f8-8cc3-a792eb3d80f8 from this chassis (sb_readonly=0)
Nov 28 11:27:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:46Z|00146|binding|INFO|Setting lport c5a5eead-793d-43f8-8cc3-a792eb3d80f8 down in Southbound
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.435 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:46Z|00147|binding|INFO|Removing iface tapc5a5eead-79 ovn-installed in OVS
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.437 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.441 187256 DEBUG nova.virt.libvirt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Received event <DeviceRemovedEvent: 1764347266.441237, ecbea330-ccac-4a01-a80b-0c10a2f686e2 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.444 187256 DEBUG nova.virt.libvirt.driver [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Start waiting for the detach event from libvirt for device tapc5a5eead-79 with device alias net1 for instance ecbea330-ccac-4a01-a80b-0c10a2f686e2 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.445 187256 DEBUG nova.virt.libvirt.guest [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:6a:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc5a5eead-79"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.446 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:6a:b5 10.100.0.27', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f663f661-7b7b-4edb-989d-ff8406790f59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0a41596a-bfeb-4aa8-962b-05fdfeac8603, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=c5a5eead-793d-43f8-8cc3-a792eb3d80f8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.448 104369 INFO neutron.agent.ovn.metadata.agent [-] Port c5a5eead-793d-43f8-8cc3-a792eb3d80f8 in datapath f663f661-7b7b-4edb-989d-ff8406790f59 unbound from our chassis#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.448 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.449 187256 DEBUG nova.virt.libvirt.guest [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c0:6a:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc5a5eead-79"/></interface>not found in domain: <domain type='kvm' id='9'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <name>instance-0000001a</name>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <uuid>ecbea330-ccac-4a01-a80b-0c10a2f686e2</uuid>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:name>tempest-TestNetworkBasicOps-server-1122100469</nova:name>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:creationTime>2025-11-28 16:26:31</nova:creationTime>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:flavor name="m1.nano">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:memory>128</nova:memory>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:disk>1</nova:disk>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:swap>0</nova:swap>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:vcpus>1</nova:vcpus>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </nova:flavor>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:owner>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </nova:owner>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:ports>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:port uuid="ccc62a21-60d5-4151-8ab5-c33149100cd0">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </nova:port>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:port uuid="c5a5eead-793d-43f8-8cc3-a792eb3d80f8">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </nova:port>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </nova:ports>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: </nova:instance>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <memory unit='KiB'>131072</memory>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <vcpu placement='static'>1</vcpu>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <resource>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <partition>/machine</partition>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </resource>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <sysinfo type='smbios'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='manufacturer'>RDO</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='product'>OpenStack Compute</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='serial'>ecbea330-ccac-4a01-a80b-0c10a2f686e2</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='uuid'>ecbea330-ccac-4a01-a80b-0c10a2f686e2</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <entry name='family'>Virtual Machine</entry>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <boot dev='hd'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <smbios mode='sysinfo'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <vmcoreinfo state='on'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <cpu mode='custom' match='exact' check='full'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <model fallback='forbid'>Nehalem</model>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <feature policy='require' name='x2apic'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <feature policy='require' name='hypervisor'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <feature policy='require' name='vme'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <clock offset='utc'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <timer name='pit' tickpolicy='delay'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <timer name='hpet' present='no'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <on_poweroff>destroy</on_poweroff>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <on_reboot>restart</on_reboot>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <on_crash>destroy</on_crash>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <disk type='file' device='disk'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <source file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk' index='2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <backingStore type='file' index='3'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:        <format type='raw'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:        <source file='/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:        <backingStore/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      </backingStore>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target dev='vda' bus='virtio'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='virtio-disk0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <disk type='file' device='cdrom'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <driver name='qemu' type='raw' cache='none'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <source file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.config' index='1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <backingStore/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target dev='sda' bus='sata'/>
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.450 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f663f661-7b7b-4edb-989d-ff8406790f59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <readonly/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='sata0-0-0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='0' model='pcie-root'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pcie.0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='1' port='0x10'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='2' port='0x11'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 28 11:27:46 np0005538960 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='3' port='0x12'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.3'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='4' port='0x13'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.4'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='5' port='0x14'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.5'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='6' port='0x15'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.6'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='7' port='0x16'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.7'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='8' port='0x17'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.8'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='9' port='0x18'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.9'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='10' port='0x19'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.10'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='11' port='0x1a'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.11'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='12' port='0x1b'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.12'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='13' port='0x1c'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.13'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='14' port='0x1d'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.14'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='15' port='0x1e'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.15'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='16' port='0x1f'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.16'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='17' port='0x20'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.17'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='18' port='0x21'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.18'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='19' port='0x22'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.19'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='20' port='0x23'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.20'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='21' port='0x24'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.21'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='22' port='0x25'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.22'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='23' port='0x26'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.23'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='24' port='0x27'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.24'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target chassis='25' port='0x28'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.25'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model name='pcie-pci-bridge'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='pci.26'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='usb'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <controller type='sata' index='0'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='ide'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <interface type='ethernet'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <mac address='fa:16:3e:a3:d6:a7'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target dev='tapccc62a21-60'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model type='virtio'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <driver name='vhost' rx_queue_size='512'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <mtu size='1442'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='net0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <serial type='pty'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <source path='/dev/pts/0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <log file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/console.log' append='off'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target type='isa-serial' port='0'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:        <model name='isa-serial'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      </target>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='serial0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <console type='pty' tty='/dev/pts/0'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <source path='/dev/pts/0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <log file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/console.log' append='off'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <target type='serial' port='0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='serial0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </console>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <input type='tablet' bus='usb'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='input0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='usb' bus='0' port='1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <input type='mouse' bus='ps2'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='input1'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <input type='keyboard' bus='ps2'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='input2'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <listen type='address' address='::0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </graphics>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <audio id='1' type='none'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <model type='virtio' heads='1' primary='yes'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='video0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <watchdog model='itco' action='reset'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='watchdog0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </watchdog>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <memballoon model='virtio'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <stats period='10'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='balloon0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <rng model='virtio'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <backend model='random'>/dev/urandom</backend>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <alias name='rng0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <label>system_u:system_r:svirt_t:s0:c563,c890</label>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c563,c890</imagelabel>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </seclabel>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <label>+107:+107</label>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <imagelabel>+107:+107</imagelabel>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </seclabel>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.449 187256 INFO nova.virt.libvirt.driver [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully detached device tapc5a5eead-79 from instance ecbea330-ccac-4a01-a80b-0c10a2f686e2 from the live domain config.
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.450 187256 DEBUG nova.virt.libvirt.vif [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1122100469',display_name='tempest-TestNetworkBasicOps-server-1122100469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1122100469',id=26,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIlPWyvkROh7MBxD8hU490Sb9a2OSX3b0N0u427OSfQDYupob+Q87e0mTGtZ03o9uU2OcrEOzyX3GaZpnMoGT/Lwyo3imGuadY4jiKIo2URn+d5N+y/vPBH3pm/LOkXk6Q==',key_name='tempest-TestNetworkBasicOps-1436030712',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-i3jyimxo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:25:55Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=ecbea330-ccac-4a01-a80b-0c10a2f686e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": 
{}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.450 187256 DEBUG nova.network.os_vif_util [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.451 187256 DEBUG nova.network.os_vif_util [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.452 187256 DEBUG os_vif [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.453 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[21a53d43-e820-4fe2-9069-d0ce1699ffce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.453 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59 namespace which is not needed anymore
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.453 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.454 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5a5eead-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.456 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.458 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.460 187256 INFO os_vif [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79')
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.461 187256 DEBUG nova.virt.libvirt.guest [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:name>tempest-TestNetworkBasicOps-server-1122100469</nova:name>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:creationTime>2025-11-28 16:27:46</nova:creationTime>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:flavor name="m1.nano">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:memory>128</nova:memory>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:disk>1</nova:disk>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:swap>0</nova:swap>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:vcpus>1</nova:vcpus>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </nova:flavor>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:owner>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </nova:owner>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  <nova:ports>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    <nova:port uuid="ccc62a21-60d5-4151-8ab5-c33149100cd0">
Nov 28 11:27:46 np0005538960 nova_compute[187252]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:    </nova:port>
Nov 28 11:27:46 np0005538960 nova_compute[187252]:  </nova:ports>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: </nova:instance>
Nov 28 11:27:46 np0005538960 nova_compute[187252]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Nov 28 11:27:46 np0005538960 rsyslogd[1008]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Nov 28 11:27:46 np0005538960 neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59[218769]: [NOTICE]   (218773) : haproxy version is 2.8.14-c23fe91
Nov 28 11:27:46 np0005538960 neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59[218769]: [NOTICE]   (218773) : path to executable is /usr/sbin/haproxy
Nov 28 11:27:46 np0005538960 neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59[218769]: [WARNING]  (218773) : Exiting Master process...
Nov 28 11:27:46 np0005538960 neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59[218769]: [ALERT]    (218773) : Current worker (218775) exited with code 143 (Terminated)
Nov 28 11:27:46 np0005538960 neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59[218769]: [WARNING]  (218773) : All workers exited. Exiting... (0)
Nov 28 11:27:46 np0005538960 systemd[1]: libpod-244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad.scope: Deactivated successfully.
Nov 28 11:27:46 np0005538960 podman[219634]: 2025-11-28 16:27:46.595983102 +0000 UTC m=+0.048380576 container died 244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 11:27:46 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad-userdata-shm.mount: Deactivated successfully.
Nov 28 11:27:46 np0005538960 systemd[1]: var-lib-containers-storage-overlay-60ba8d0cc65fb1b4bad4eb43253fa522e7f8d89472a8857fb8cd60de8023e023-merged.mount: Deactivated successfully.
Nov 28 11:27:46 np0005538960 podman[219634]: 2025-11-28 16:27:46.642868555 +0000 UTC m=+0.095266029 container cleanup 244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:27:46 np0005538960 systemd[1]: libpod-conmon-244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad.scope: Deactivated successfully.
Nov 28 11:27:46 np0005538960 podman[219667]: 2025-11-28 16:27:46.711365087 +0000 UTC m=+0.045101573 container remove 244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.720 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ec7a927a-b3b0-4f32-a193-5ce4665f77ea]: (4, ('Fri Nov 28 04:27:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59 (244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad)\n244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad\nFri Nov 28 04:27:46 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59 (244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad)\n244d7662e35534cedca849867946f0514bacc4cb1d1cd33e53de64fe8c7890ad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.724 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe824ec-86a9-468d-80c5-46918716d3dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.725 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf663f661-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.728 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:46 np0005538960 kernel: tapf663f661-70: left promiscuous mode
Nov 28 11:27:46 np0005538960 nova_compute[187252]: 2025-11-28 16:27:46.740 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.744 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c51977ce-e8e9-44a2-a6c1-e787185e98c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.759 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c2bfe770-6623-4d65-80d3-30d1bc671b42]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.760 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5055b0-1f5b-4805-b856-97d4a0d69763]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.779 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ce6bc58d-4f8d-4079-ab77-4d5227c363fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 429059, 'reachable_time': 42547, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219682, 'error': None, 'target': 'ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.782 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f663f661-7b7b-4edb-989d-ff8406790f59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:27:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:46.782 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[24c8ea11-cf59-4bfd-81b4-c3ef7bc480b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:46 np0005538960 systemd[1]: run-netns-ovnmeta\x2df663f661\x2d7b7b\x2d4edb\x2d989d\x2dff8406790f59.mount: Deactivated successfully.
Nov 28 11:27:47 np0005538960 nova_compute[187252]: 2025-11-28 16:27:47.116 187256 DEBUG nova.compute.manager [req-2212531c-eb96-45bb-8b32-3291a2543f90 req-9793dc51-b013-462f-9861-889444ad0445 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-vif-unplugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:47 np0005538960 nova_compute[187252]: 2025-11-28 16:27:47.116 187256 DEBUG oslo_concurrency.lockutils [req-2212531c-eb96-45bb-8b32-3291a2543f90 req-9793dc51-b013-462f-9861-889444ad0445 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:47 np0005538960 nova_compute[187252]: 2025-11-28 16:27:47.116 187256 DEBUG oslo_concurrency.lockutils [req-2212531c-eb96-45bb-8b32-3291a2543f90 req-9793dc51-b013-462f-9861-889444ad0445 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:47 np0005538960 nova_compute[187252]: 2025-11-28 16:27:47.117 187256 DEBUG oslo_concurrency.lockutils [req-2212531c-eb96-45bb-8b32-3291a2543f90 req-9793dc51-b013-462f-9861-889444ad0445 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:47 np0005538960 nova_compute[187252]: 2025-11-28 16:27:47.117 187256 DEBUG nova.compute.manager [req-2212531c-eb96-45bb-8b32-3291a2543f90 req-9793dc51-b013-462f-9861-889444ad0445 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] No waiting events found dispatching network-vif-unplugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:27:47 np0005538960 nova_compute[187252]: 2025-11-28 16:27:47.117 187256 WARNING nova.compute.manager [req-2212531c-eb96-45bb-8b32-3291a2543f90 req-9793dc51-b013-462f-9861-889444ad0445 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received unexpected event network-vif-unplugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:27:47 np0005538960 nova_compute[187252]: 2025-11-28 16:27:47.721 187256 DEBUG oslo_concurrency.lockutils [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:27:47 np0005538960 nova_compute[187252]: 2025-11-28 16:27:47.722 187256 DEBUG oslo_concurrency.lockutils [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:27:47 np0005538960 nova_compute[187252]: 2025-11-28 16:27:47.722 187256 DEBUG nova.network.neutron [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:27:48 np0005538960 nova_compute[187252]: 2025-11-28 16:27:48.032 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:48 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:48Z|00148|binding|INFO|Releasing lport 065b851d-69a4-49d0-a066-f5c141f99961 from this chassis (sb_readonly=0)
Nov 28 11:27:48 np0005538960 nova_compute[187252]: 2025-11-28 16:27:48.959 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.246 187256 DEBUG nova.compute.manager [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-vif-plugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.246 187256 DEBUG oslo_concurrency.lockutils [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.246 187256 DEBUG oslo_concurrency.lockutils [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.247 187256 DEBUG oslo_concurrency.lockutils [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.247 187256 DEBUG nova.compute.manager [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] No waiting events found dispatching network-vif-plugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.247 187256 WARNING nova.compute.manager [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received unexpected event network-vif-plugged-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.247 187256 DEBUG nova.compute.manager [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-vif-deleted-c5a5eead-793d-43f8-8cc3-a792eb3d80f8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.247 187256 INFO nova.compute.manager [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Neutron deleted interface c5a5eead-793d-43f8-8cc3-a792eb3d80f8; detaching it from the instance and deleting it from the info cache#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.248 187256 DEBUG nova.network.neutron [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.276 187256 DEBUG nova.objects.instance [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lazy-loading 'system_metadata' on Instance uuid ecbea330-ccac-4a01-a80b-0c10a2f686e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.309 187256 DEBUG nova.objects.instance [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lazy-loading 'flavor' on Instance uuid ecbea330-ccac-4a01-a80b-0c10a2f686e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.347 187256 DEBUG nova.virt.libvirt.vif [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1122100469',display_name='tempest-TestNetworkBasicOps-server-1122100469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1122100469',id=26,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIlPWyvkROh7MBxD8hU490Sb9a2OSX3b0N0u427OSfQDYupob+Q87e0mTGtZ03o9uU2OcrEOzyX3GaZpnMoGT/Lwyo3imGuadY4jiKIo2URn+d5N+y/vPBH3pm/LOkXk6Q==',key_name='tempest-TestNetworkBasicOps-1436030712',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-i3jyimxo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:25:55Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=ecbea330-ccac-4a01-a80b-0c10a2f686e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.348 187256 DEBUG nova.network.os_vif_util [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Converting VIF {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.349 187256 DEBUG nova.network.os_vif_util [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.353 187256 DEBUG nova.virt.libvirt.guest [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:6a:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc5a5eead-79"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.357 187256 DEBUG nova.virt.libvirt.guest [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c0:6a:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc5a5eead-79"/></interface>not found in domain: <domain type='kvm' id='9'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <name>instance-0000001a</name>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <uuid>ecbea330-ccac-4a01-a80b-0c10a2f686e2</uuid>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:name>tempest-TestNetworkBasicOps-server-1122100469</nova:name>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:creationTime>2025-11-28 16:27:46</nova:creationTime>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:flavor name="m1.nano">
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:memory>128</nova:memory>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:disk>1</nova:disk>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:swap>0</nova:swap>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:vcpus>1</nova:vcpus>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </nova:flavor>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:owner>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </nova:owner>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:ports>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:port uuid="ccc62a21-60d5-4151-8ab5-c33149100cd0">
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </nova:port>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </nova:ports>
Nov 28 11:27:49 np0005538960 nova_compute[187252]: </nova:instance>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <memory unit='KiB'>131072</memory>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <vcpu placement='static'>1</vcpu>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <resource>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <partition>/machine</partition>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </resource>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <sysinfo type='smbios'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='manufacturer'>RDO</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='product'>OpenStack Compute</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='serial'>ecbea330-ccac-4a01-a80b-0c10a2f686e2</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='uuid'>ecbea330-ccac-4a01-a80b-0c10a2f686e2</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='family'>Virtual Machine</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <boot dev='hd'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <smbios mode='sysinfo'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <vmcoreinfo state='on'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <cpu mode='custom' match='exact' check='full'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <model fallback='forbid'>Nehalem</model>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <feature policy='require' name='x2apic'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <feature policy='require' name='hypervisor'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <feature policy='require' name='vme'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <clock offset='utc'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <timer name='pit' tickpolicy='delay'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <timer name='hpet' present='no'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <on_poweroff>destroy</on_poweroff>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <on_reboot>restart</on_reboot>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <on_crash>destroy</on_crash>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <disk type='file' device='disk'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <source file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk' index='2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <backingStore type='file' index='3'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:        <format type='raw'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:        <source file='/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:        <backingStore/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      </backingStore>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target dev='vda' bus='virtio'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='virtio-disk0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <disk type='file' device='cdrom'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <driver name='qemu' type='raw' cache='none'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <source file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.config' index='1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <backingStore/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target dev='sda' bus='sata'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <readonly/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='sata0-0-0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='0' model='pcie-root'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pcie.0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='1' port='0x10'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='2' port='0x11'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='3' port='0x12'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.3'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='4' port='0x13'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.4'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='5' port='0x14'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.5'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='6' port='0x15'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.6'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='7' port='0x16'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.7'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='8' port='0x17'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.8'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='9' port='0x18'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.9'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='10' port='0x19'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.10'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='11' port='0x1a'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.11'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='12' port='0x1b'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.12'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='13' port='0x1c'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.13'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='14' port='0x1d'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.14'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='15' port='0x1e'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.15'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='16' port='0x1f'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.16'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='17' port='0x20'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.17'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='18' port='0x21'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.18'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='19' port='0x22'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.19'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='20' port='0x23'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.20'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='21' port='0x24'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.21'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='22' port='0x25'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.22'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='23' port='0x26'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.23'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='24' port='0x27'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.24'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='25' port='0x28'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.25'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-pci-bridge'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.26'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='usb'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='sata' index='0'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='ide'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <interface type='ethernet'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <mac address='fa:16:3e:a3:d6:a7'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target dev='tapccc62a21-60'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model type='virtio'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <driver name='vhost' rx_queue_size='512'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <mtu size='1442'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='net0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <serial type='pty'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <source path='/dev/pts/0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <log file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/console.log' append='off'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target type='isa-serial' port='0'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:        <model name='isa-serial'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      </target>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='serial0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <console type='pty' tty='/dev/pts/0'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <source path='/dev/pts/0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <log file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/console.log' append='off'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target type='serial' port='0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='serial0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </console>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <input type='tablet' bus='usb'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='input0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='usb' bus='0' port='1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <input type='mouse' bus='ps2'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='input1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <input type='keyboard' bus='ps2'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='input2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <listen type='address' address='::0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </graphics>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <audio id='1' type='none'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model type='virtio' heads='1' primary='yes'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='video0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <watchdog model='itco' action='reset'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='watchdog0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </watchdog>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <memballoon model='virtio'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <stats period='10'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='balloon0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <rng model='virtio'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <backend model='random'>/dev/urandom</backend>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='rng0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <label>system_u:system_r:svirt_t:s0:c563,c890</label>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c563,c890</imagelabel>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </seclabel>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <label>+107:+107</label>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <imagelabel>+107:+107</imagelabel>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </seclabel>
Nov 28 11:27:49 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:27:49 np0005538960 nova_compute[187252]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.358 187256 DEBUG nova.virt.libvirt.guest [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:c0:6a:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc5a5eead-79"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.361 187256 DEBUG nova.virt.libvirt.guest [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:c0:6a:b5"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapc5a5eead-79"/></interface>not found in domain: <domain type='kvm' id='9'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <name>instance-0000001a</name>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <uuid>ecbea330-ccac-4a01-a80b-0c10a2f686e2</uuid>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:name>tempest-TestNetworkBasicOps-server-1122100469</nova:name>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:creationTime>2025-11-28 16:27:46</nova:creationTime>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:flavor name="m1.nano">
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:memory>128</nova:memory>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:disk>1</nova:disk>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:swap>0</nova:swap>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:vcpus>1</nova:vcpus>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </nova:flavor>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:owner>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </nova:owner>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:ports>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:port uuid="ccc62a21-60d5-4151-8ab5-c33149100cd0">
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </nova:port>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </nova:ports>
Nov 28 11:27:49 np0005538960 nova_compute[187252]: </nova:instance>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <memory unit='KiB'>131072</memory>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <currentMemory unit='KiB'>131072</currentMemory>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <vcpu placement='static'>1</vcpu>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <resource>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <partition>/machine</partition>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </resource>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <sysinfo type='smbios'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='manufacturer'>RDO</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='product'>OpenStack Compute</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='serial'>ecbea330-ccac-4a01-a80b-0c10a2f686e2</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='uuid'>ecbea330-ccac-4a01-a80b-0c10a2f686e2</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <entry name='family'>Virtual Machine</entry>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <boot dev='hd'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <smbios mode='sysinfo'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <vmcoreinfo state='on'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <cpu mode='custom' match='exact' check='full'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <model fallback='forbid'>Nehalem</model>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <feature policy='require' name='x2apic'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <feature policy='require' name='hypervisor'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <feature policy='require' name='vme'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <clock offset='utc'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <timer name='pit' tickpolicy='delay'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <timer name='rtc' tickpolicy='catchup'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <timer name='hpet' present='no'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <on_poweroff>destroy</on_poweroff>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <on_reboot>restart</on_reboot>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <on_crash>destroy</on_crash>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <disk type='file' device='disk'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <driver name='qemu' type='qcow2' cache='none'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <source file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk' index='2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <backingStore type='file' index='3'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:        <format type='raw'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:        <source file='/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:        <backingStore/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      </backingStore>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target dev='vda' bus='virtio'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='virtio-disk0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <disk type='file' device='cdrom'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <driver name='qemu' type='raw' cache='none'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <source file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/disk.config' index='1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <backingStore/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target dev='sda' bus='sata'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <readonly/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='sata0-0-0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='0' model='pcie-root'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pcie.0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='1' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='1' port='0x10'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='2' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='2' port='0x11'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='3' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='3' port='0x12'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.3'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='4' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='4' port='0x13'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.4'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='5' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='5' port='0x14'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.5'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='6' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='6' port='0x15'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.6'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='7' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='7' port='0x16'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.7'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='8' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='8' port='0x17'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.8'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='9' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='9' port='0x18'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.9'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='10' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='10' port='0x19'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.10'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='11' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='11' port='0x1a'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.11'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='12' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='12' port='0x1b'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.12'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='13' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='13' port='0x1c'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.13'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='14' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='14' port='0x1d'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.14'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='15' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='15' port='0x1e'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.15'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='16' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='16' port='0x1f'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.16'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='17' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='17' port='0x20'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.17'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='18' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='18' port='0x21'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.18'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='19' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='19' port='0x22'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.19'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='20' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='20' port='0x23'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.20'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='21' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='21' port='0x24'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.21'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='22' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='22' port='0x25'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.22'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='23' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='23' port='0x26'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.23'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='24' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='24' port='0x27'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.24'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='25' model='pcie-root-port'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-root-port'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target chassis='25' port='0x28'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.25'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model name='pcie-pci-bridge'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='pci.26'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='usb' index='0' model='piix3-uhci'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='usb'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <controller type='sata' index='0'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='ide'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </controller>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <interface type='ethernet'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <mac address='fa:16:3e:a3:d6:a7'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target dev='tapccc62a21-60'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model type='virtio'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <driver name='vhost' rx_queue_size='512'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <mtu size='1442'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='net0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <serial type='pty'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <source path='/dev/pts/0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <log file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/console.log' append='off'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target type='isa-serial' port='0'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:        <model name='isa-serial'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      </target>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='serial0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <console type='pty' tty='/dev/pts/0'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <source path='/dev/pts/0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <log file='/var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2/console.log' append='off'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <target type='serial' port='0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='serial0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </console>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <input type='tablet' bus='usb'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='input0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='usb' bus='0' port='1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <input type='mouse' bus='ps2'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='input1'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <input type='keyboard' bus='ps2'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='input2'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </input>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <listen type='address' address='::0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </graphics>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <audio id='1' type='none'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <model type='virtio' heads='1' primary='yes'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='video0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <watchdog model='itco' action='reset'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='watchdog0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </watchdog>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <memballoon model='virtio'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <stats period='10'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='balloon0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <rng model='virtio'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <backend model='random'>/dev/urandom</backend>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <alias name='rng0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <label>system_u:system_r:svirt_t:s0:c563,c890</label>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c563,c890</imagelabel>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </seclabel>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <label>+107:+107</label>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <imagelabel>+107:+107</imagelabel>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </seclabel>
Nov 28 11:27:49 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:27:49 np0005538960 nova_compute[187252]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.361 187256 WARNING nova.virt.libvirt.driver [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Detaching interface fa:16:3e:c0:6a:b5 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapc5a5eead-79' not found.
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.362 187256 DEBUG nova.virt.libvirt.vif [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1122100469',display_name='tempest-TestNetworkBasicOps-server-1122100469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1122100469',id=26,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIlPWyvkROh7MBxD8hU490Sb9a2OSX3b0N0u427OSfQDYupob+Q87e0mTGtZ03o9uU2OcrEOzyX3GaZpnMoGT/Lwyo3imGuadY4jiKIo2URn+d5N+y/vPBH3pm/LOkXk6Q==',key_name='tempest-TestNetworkBasicOps-1436030712',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-i3jyimxo',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:25:55Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=ecbea330-ccac-4a01-a80b-0c10a2f686e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.362 187256 DEBUG nova.network.os_vif_util [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Converting VIF {"id": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "address": "fa:16:3e:c0:6a:b5", "network": {"id": "f663f661-7b7b-4edb-989d-ff8406790f59", "bridge": "br-int", "label": "tempest-network-smoke--26385101", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc5a5eead-79", "ovs_interfaceid": "c5a5eead-793d-43f8-8cc3-a792eb3d80f8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.363 187256 DEBUG nova.network.os_vif_util [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.363 187256 DEBUG os_vif [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.366 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.366 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5a5eead-79, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.366 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.368 187256 INFO os_vif [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:6a:b5,bridge_name='br-int',has_traffic_filtering=True,id=c5a5eead-793d-43f8-8cc3-a792eb3d80f8,network=Network(f663f661-7b7b-4edb-989d-ff8406790f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc5a5eead-79')#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.369 187256 DEBUG nova.virt.libvirt.guest [req-6baca783-d9c6-4464-bd41-0166f19e1963 req-4821d1d1-4e01-4621-a515-a933f7a18e55 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:name>tempest-TestNetworkBasicOps-server-1122100469</nova:name>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:creationTime>2025-11-28 16:27:49</nova:creationTime>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:flavor name="m1.nano">
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:memory>128</nova:memory>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:disk>1</nova:disk>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:swap>0</nova:swap>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:vcpus>1</nova:vcpus>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </nova:flavor>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:owner>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </nova:owner>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  <nova:ports>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    <nova:port uuid="ccc62a21-60d5-4151-8ab5-c33149100cd0">
Nov 28 11:27:49 np0005538960 nova_compute[187252]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:    </nova:port>
Nov 28 11:27:49 np0005538960 nova_compute[187252]:  </nova:ports>
Nov 28 11:27:49 np0005538960 nova_compute[187252]: </nova:instance>
Nov 28 11:27:49 np0005538960 nova_compute[187252]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.944 187256 DEBUG nova.compute.manager [req-883f881e-8ca8-4816-8b67-d1c43c24f80a req-b05a8400-6939-4e11-9bf9-58e7d0c2ef66 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-changed-ccc62a21-60d5-4151-8ab5-c33149100cd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.944 187256 DEBUG nova.compute.manager [req-883f881e-8ca8-4816-8b67-d1c43c24f80a req-b05a8400-6939-4e11-9bf9-58e7d0c2ef66 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing instance network info cache due to event network-changed-ccc62a21-60d5-4151-8ab5-c33149100cd0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:27:49 np0005538960 nova_compute[187252]: 2025-11-28 16:27:49.944 187256 DEBUG oslo_concurrency.lockutils [req-883f881e-8ca8-4816-8b67-d1c43c24f80a req-b05a8400-6939-4e11-9bf9-58e7d0c2ef66 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:27:50 np0005538960 podman[219683]: 2025-11-28 16:27:50.152426605 +0000 UTC m=+0.060823848 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.200 187256 DEBUG oslo_concurrency.lockutils [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.200 187256 DEBUG oslo_concurrency.lockutils [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.201 187256 DEBUG oslo_concurrency.lockutils [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.201 187256 DEBUG oslo_concurrency.lockutils [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.201 187256 DEBUG oslo_concurrency.lockutils [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.203 187256 INFO nova.compute.manager [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Terminating instance#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.204 187256 DEBUG nova.compute.manager [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:27:50 np0005538960 kernel: tapccc62a21-60 (unregistering): left promiscuous mode
Nov 28 11:27:50 np0005538960 NetworkManager[55548]: <info>  [1764347270.2264] device (tapccc62a21-60): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.232 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:50 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:50Z|00149|binding|INFO|Releasing lport ccc62a21-60d5-4151-8ab5-c33149100cd0 from this chassis (sb_readonly=0)
Nov 28 11:27:50 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:50Z|00150|binding|INFO|Setting lport ccc62a21-60d5-4151-8ab5-c33149100cd0 down in Southbound
Nov 28 11:27:50 np0005538960 ovn_controller[95460]: 2025-11-28T16:27:50Z|00151|binding|INFO|Removing iface tapccc62a21-60 ovn-installed in OVS
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.235 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.240 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:d6:a7 10.100.0.14'], port_security=['fa:16:3e:a3:d6:a7 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ecbea330-ccac-4a01-a80b-0c10a2f686e2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-577c0581-66ed-41fb-8a29-0e25a0007ac2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'da1cc23b-094e-481a-bce5-3cc0ef981d77', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b5b2475e-34ed-477b-9001-b455c0e4d7e2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=ccc62a21-60d5-4151-8ab5-c33149100cd0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.242 104369 INFO neutron.agent.ovn.metadata.agent [-] Port ccc62a21-60d5-4151-8ab5-c33149100cd0 in datapath 577c0581-66ed-41fb-8a29-0e25a0007ac2 unbound from our chassis#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.243 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 577c0581-66ed-41fb-8a29-0e25a0007ac2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.244 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b5dfa490-f993-4833-a8cc-2ef3cfd70980]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.245 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2 namespace which is not needed anymore#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.251 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:50 np0005538960 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Nov 28 11:27:50 np0005538960 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000001a.scope: Consumed 19.236s CPU time.
Nov 28 11:27:50 np0005538960 systemd-machined[153518]: Machine qemu-9-instance-0000001a terminated.
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.351 187256 INFO nova.network.neutron [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Port c5a5eead-793d-43f8-8cc3-a792eb3d80f8 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.351 187256 DEBUG nova.network.neutron [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.367 187256 DEBUG oslo_concurrency.lockutils [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.370 187256 DEBUG oslo_concurrency.lockutils [req-883f881e-8ca8-4816-8b67-d1c43c24f80a req-b05a8400-6939-4e11-9bf9-58e7d0c2ef66 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.371 187256 DEBUG nova.network.neutron [req-883f881e-8ca8-4816-8b67-d1c43c24f80a req-b05a8400-6939-4e11-9bf9-58e7d0c2ef66 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Refreshing network info cache for port ccc62a21-60d5-4151-8ab5-c33149100cd0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:27:50 np0005538960 neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2[218439]: [NOTICE]   (218443) : haproxy version is 2.8.14-c23fe91
Nov 28 11:27:50 np0005538960 neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2[218439]: [NOTICE]   (218443) : path to executable is /usr/sbin/haproxy
Nov 28 11:27:50 np0005538960 neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2[218439]: [WARNING]  (218443) : Exiting Master process...
Nov 28 11:27:50 np0005538960 neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2[218439]: [WARNING]  (218443) : Exiting Master process...
Nov 28 11:27:50 np0005538960 neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2[218439]: [ALERT]    (218443) : Current worker (218445) exited with code 143 (Terminated)
Nov 28 11:27:50 np0005538960 neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2[218439]: [WARNING]  (218443) : All workers exited. Exiting... (0)
Nov 28 11:27:50 np0005538960 systemd[1]: libpod-fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e.scope: Deactivated successfully.
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.391 187256 DEBUG oslo_concurrency.lockutils [None req-61e091e2-14e5-44ac-a2a7-7b8c1966c7bd a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "interface-ecbea330-ccac-4a01-a80b-0c10a2f686e2-c5a5eead-793d-43f8-8cc3-a792eb3d80f8" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 4.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:50 np0005538960 podman[219727]: 2025-11-28 16:27:50.392766771 +0000 UTC m=+0.046870023 container died fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:27:50 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e-userdata-shm.mount: Deactivated successfully.
Nov 28 11:27:50 np0005538960 systemd[1]: var-lib-containers-storage-overlay-b66ce6399351065bf1c77a1efda1b1cff97cc642369591071f80a7143088280e-merged.mount: Deactivated successfully.
Nov 28 11:27:50 np0005538960 podman[219727]: 2025-11-28 16:27:50.43601731 +0000 UTC m=+0.090120562 container cleanup fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:27:50 np0005538960 systemd[1]: libpod-conmon-fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e.scope: Deactivated successfully.
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.469 187256 INFO nova.virt.libvirt.driver [-] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Instance destroyed successfully.#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.470 187256 DEBUG nova.objects.instance [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'resources' on Instance uuid ecbea330-ccac-4a01-a80b-0c10a2f686e2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.484 187256 DEBUG nova.virt.libvirt.vif [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:25:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1122100469',display_name='tempest-TestNetworkBasicOps-server-1122100469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1122100469',id=26,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIlPWyvkROh7MBxD8hU490Sb9a2OSX3b0N0u427OSfQDYupob+Q87e0mTGtZ03o9uU2OcrEOzyX3GaZpnMoGT/Lwyo3imGuadY4jiKIo2URn+d5N+y/vPBH3pm/LOkXk6Q==',key_name='tempest-TestNetworkBasicOps-1436030712',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:25:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-i3jyimxo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:25:55Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=ecbea330-ccac-4a01-a80b-0c10a2f686e2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.485 187256 DEBUG nova.network.os_vif_util [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.485 187256 DEBUG nova.network.os_vif_util [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a3:d6:a7,bridge_name='br-int',has_traffic_filtering=True,id=ccc62a21-60d5-4151-8ab5-c33149100cd0,network=Network(577c0581-66ed-41fb-8a29-0e25a0007ac2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccc62a21-60') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.486 187256 DEBUG os_vif [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:d6:a7,bridge_name='br-int',has_traffic_filtering=True,id=ccc62a21-60d5-4151-8ab5-c33149100cd0,network=Network(577c0581-66ed-41fb-8a29-0e25a0007ac2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccc62a21-60') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.488 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.488 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapccc62a21-60, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.490 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.493 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.497 187256 INFO os_vif [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:d6:a7,bridge_name='br-int',has_traffic_filtering=True,id=ccc62a21-60d5-4151-8ab5-c33149100cd0,network=Network(577c0581-66ed-41fb-8a29-0e25a0007ac2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapccc62a21-60')#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.498 187256 INFO nova.virt.libvirt.driver [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Deleting instance files /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2_del#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.499 187256 INFO nova.virt.libvirt.driver [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Deletion of /var/lib/nova/instances/ecbea330-ccac-4a01-a80b-0c10a2f686e2_del complete#033[00m
Nov 28 11:27:50 np0005538960 podman[219766]: 2025-11-28 16:27:50.509610098 +0000 UTC m=+0.049539214 container remove fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.515 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9a0ee38b-301e-44c5-b89e-61356860f2da]: (4, ('Fri Nov 28 04:27:50 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2 (fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e)\nfbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e\nFri Nov 28 04:27:50 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2 (fbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e)\nfbd03b8ad8d33a18674518ed6c9d936c2978c32e20d63dece7d7d7b2af38ff4e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.517 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[44344c8d-3fdd-4790-aba7-03d47b736cf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.518 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap577c0581-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.520 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:50 np0005538960 kernel: tap577c0581-60: left promiscuous mode
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.533 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.536 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[7383a0fd-df7a-4cc6-976f-b78180f86985]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.541 187256 INFO nova.compute.manager [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.542 187256 DEBUG oslo.service.loopingcall [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.542 187256 DEBUG nova.compute.manager [-] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:27:50 np0005538960 nova_compute[187252]: 2025-11-28 16:27:50.542 187256 DEBUG nova.network.neutron [-] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.557 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6eba8192-7581-46bc-bd6f-0b055d866cdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.559 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[61cfed0a-286d-4cea-b3fe-494e80741640]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.576 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c87f71e9-2838-4d1f-a918-8f49e359058f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 425085, 'reachable_time': 25010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219785, 'error': None, 'target': 'ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.579 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-577c0581-66ed-41fb-8a29-0e25a0007ac2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:27:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:27:50.579 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[f1770683-cca9-4a9f-8c92-987071896a46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:27:50 np0005538960 systemd[1]: run-netns-ovnmeta\x2d577c0581\x2d66ed\x2d41fb\x2d8a29\x2d0e25a0007ac2.mount: Deactivated successfully.
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.271 187256 DEBUG nova.network.neutron [-] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.286 187256 INFO nova.compute.manager [-] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Took 0.74 seconds to deallocate network for instance.#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.347 187256 DEBUG oslo_concurrency.lockutils [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.347 187256 DEBUG oslo_concurrency.lockutils [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.377 187256 DEBUG nova.compute.manager [req-3e53b4ff-4539-4dde-97fe-7c7c06a25dfb req-3052a06e-a461-424f-914c-41c6fc657d34 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-vif-unplugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.378 187256 DEBUG oslo_concurrency.lockutils [req-3e53b4ff-4539-4dde-97fe-7c7c06a25dfb req-3052a06e-a461-424f-914c-41c6fc657d34 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.378 187256 DEBUG oslo_concurrency.lockutils [req-3e53b4ff-4539-4dde-97fe-7c7c06a25dfb req-3052a06e-a461-424f-914c-41c6fc657d34 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.378 187256 DEBUG oslo_concurrency.lockutils [req-3e53b4ff-4539-4dde-97fe-7c7c06a25dfb req-3052a06e-a461-424f-914c-41c6fc657d34 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.379 187256 DEBUG nova.compute.manager [req-3e53b4ff-4539-4dde-97fe-7c7c06a25dfb req-3052a06e-a461-424f-914c-41c6fc657d34 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] No waiting events found dispatching network-vif-unplugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.379 187256 WARNING nova.compute.manager [req-3e53b4ff-4539-4dde-97fe-7c7c06a25dfb req-3052a06e-a461-424f-914c-41c6fc657d34 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received unexpected event network-vif-unplugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.405 187256 DEBUG nova.compute.provider_tree [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.418 187256 DEBUG nova.scheduler.client.report [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.441 187256 DEBUG oslo_concurrency.lockutils [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.472 187256 INFO nova.scheduler.client.report [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Deleted allocations for instance ecbea330-ccac-4a01-a80b-0c10a2f686e2#033[00m
Nov 28 11:27:51 np0005538960 nova_compute[187252]: 2025-11-28 16:27:51.539 187256 DEBUG oslo_concurrency.lockutils [None req-183d7631-a212-4649-af16-aeafa872885d a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:52 np0005538960 nova_compute[187252]: 2025-11-28 16:27:52.026 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:52 np0005538960 nova_compute[187252]: 2025-11-28 16:27:52.045 187256 DEBUG nova.compute.manager [req-c7211216-9634-4df6-8cfe-ac5dd78d4260 req-cd326c38-8e6c-4cf5-945b-a20d9e31b823 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-vif-deleted-ccc62a21-60d5-4151-8ab5-c33149100cd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.034 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.145 187256 DEBUG nova.network.neutron [req-883f881e-8ca8-4816-8b67-d1c43c24f80a req-b05a8400-6939-4e11-9bf9-58e7d0c2ef66 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updated VIF entry in instance network info cache for port ccc62a21-60d5-4151-8ab5-c33149100cd0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.146 187256 DEBUG nova.network.neutron [req-883f881e-8ca8-4816-8b67-d1c43c24f80a req-b05a8400-6939-4e11-9bf9-58e7d0c2ef66 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Updating instance_info_cache with network_info: [{"id": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "address": "fa:16:3e:a3:d6:a7", "network": {"id": "577c0581-66ed-41fb-8a29-0e25a0007ac2", "bridge": "br-int", "label": "tempest-network-smoke--2058919722", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapccc62a21-60", "ovs_interfaceid": "ccc62a21-60d5-4151-8ab5-c33149100cd0", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.173 187256 DEBUG oslo_concurrency.lockutils [req-883f881e-8ca8-4816-8b67-d1c43c24f80a req-b05a8400-6939-4e11-9bf9-58e7d0c2ef66 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-ecbea330-ccac-4a01-a80b-0c10a2f686e2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.308 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.467 187256 DEBUG nova.compute.manager [req-0dc331e4-692d-4256-b865-31cda0a46698 req-3afba867-dea0-4f1b-9c96-4c1cbfdbbbe4 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received event network-vif-plugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.468 187256 DEBUG oslo_concurrency.lockutils [req-0dc331e4-692d-4256-b865-31cda0a46698 req-3afba867-dea0-4f1b-9c96-4c1cbfdbbbe4 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.468 187256 DEBUG oslo_concurrency.lockutils [req-0dc331e4-692d-4256-b865-31cda0a46698 req-3afba867-dea0-4f1b-9c96-4c1cbfdbbbe4 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.468 187256 DEBUG oslo_concurrency.lockutils [req-0dc331e4-692d-4256-b865-31cda0a46698 req-3afba867-dea0-4f1b-9c96-4c1cbfdbbbe4 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "ecbea330-ccac-4a01-a80b-0c10a2f686e2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.468 187256 DEBUG nova.compute.manager [req-0dc331e4-692d-4256-b865-31cda0a46698 req-3afba867-dea0-4f1b-9c96-4c1cbfdbbbe4 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] No waiting events found dispatching network-vif-plugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:27:53 np0005538960 nova_compute[187252]: 2025-11-28 16:27:53.468 187256 WARNING nova.compute.manager [req-0dc331e4-692d-4256-b865-31cda0a46698 req-3afba867-dea0-4f1b-9c96-4c1cbfdbbbe4 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Received unexpected event network-vif-plugged-ccc62a21-60d5-4151-8ab5-c33149100cd0 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 11:27:55 np0005538960 nova_compute[187252]: 2025-11-28 16:27:55.490 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:56 np0005538960 nova_compute[187252]: 2025-11-28 16:27:56.422 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:58 np0005538960 nova_compute[187252]: 2025-11-28 16:27:58.081 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:58 np0005538960 podman[219787]: 2025-11-28 16:27:58.170077589 +0000 UTC m=+0.062042847 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm)
Nov 28 11:27:59 np0005538960 nova_compute[187252]: 2025-11-28 16:27:59.709 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:27:59 np0005538960 nova_compute[187252]: 2025-11-28 16:27:59.893 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:00 np0005538960 nova_compute[187252]: 2025-11-28 16:28:00.494 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:02 np0005538960 podman[219811]: 2025-11-28 16:28:02.181311635 +0000 UTC m=+0.081816815 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:28:03 np0005538960 nova_compute[187252]: 2025-11-28 16:28:03.084 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.033 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.033 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.061 187256 DEBUG nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.324 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.324 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.330 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.330 187256 INFO nova.compute.claims [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.597 187256 DEBUG nova.compute.provider_tree [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.629 187256 DEBUG nova.scheduler.client.report [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.674 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.675 187256 DEBUG nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.746 187256 DEBUG nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.746 187256 DEBUG nova.network.neutron [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.766 187256 INFO nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.788 187256 DEBUG nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.888 187256 DEBUG nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.889 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.890 187256 INFO nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Creating image(s)#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.890 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.891 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.891 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.904 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.965 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.966 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.967 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:04 np0005538960 nova_compute[187252]: 2025-11-28 16:28:04.983 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.045 187256 DEBUG nova.policy [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.048 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.049 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.107 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk 1073741824" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.108 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.108 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.188 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.190 187256 DEBUG nova.virt.disk.api [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Checking if we can resize image /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.190 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.259 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.260 187256 DEBUG nova.virt.disk.api [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Cannot resize image /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.261 187256 DEBUG nova.objects.instance [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'migration_context' on Instance uuid c4c0b9e1-fd20-4bc8-b105-53c8be08942f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.278 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.278 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Ensure instance console log exists: /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.279 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.279 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.279 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.467 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347270.4664602, ecbea330-ccac-4a01-a80b-0c10a2f686e2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.468 187256 INFO nova.compute.manager [-] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.488 187256 DEBUG nova.compute.manager [None req-1ec1d7e7-71b3-449a-b28a-a8d6b289558f - - - - - -] [instance: ecbea330-ccac-4a01-a80b-0c10a2f686e2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.497 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:05 np0005538960 nova_compute[187252]: 2025-11-28 16:28:05.903 187256 DEBUG nova.network.neutron [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Successfully created port: 1a93323f-5f62-458a-b2d5-7d1745ecf9aa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:28:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:06.348 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:06.348 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:06.349 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:08 np0005538960 nova_compute[187252]: 2025-11-28 16:28:08.086 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:08 np0005538960 podman[219850]: 2025-11-28 16:28:08.203031949 +0000 UTC m=+0.105278115 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 28 11:28:08 np0005538960 nova_compute[187252]: 2025-11-28 16:28:08.237 187256 DEBUG nova.network.neutron [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Successfully updated port: 1a93323f-5f62-458a-b2d5-7d1745ecf9aa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:28:08 np0005538960 nova_compute[187252]: 2025-11-28 16:28:08.253 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:28:08 np0005538960 nova_compute[187252]: 2025-11-28 16:28:08.253 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquired lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:28:08 np0005538960 nova_compute[187252]: 2025-11-28 16:28:08.254 187256 DEBUG nova.network.neutron [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:28:08 np0005538960 nova_compute[187252]: 2025-11-28 16:28:08.510 187256 DEBUG nova.network.neutron [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:28:08 np0005538960 nova_compute[187252]: 2025-11-28 16:28:08.888 187256 DEBUG nova.compute.manager [req-d654cb55-8771-4383-b1d8-238310ea4873 req-73bab798-2701-4851-a75f-42174813fffc 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-changed-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:28:08 np0005538960 nova_compute[187252]: 2025-11-28 16:28:08.888 187256 DEBUG nova.compute.manager [req-d654cb55-8771-4383-b1d8-238310ea4873 req-73bab798-2701-4851-a75f-42174813fffc 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Refreshing instance network info cache due to event network-changed-1a93323f-5f62-458a-b2d5-7d1745ecf9aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:28:08 np0005538960 nova_compute[187252]: 2025-11-28 16:28:08.888 187256 DEBUG oslo_concurrency.lockutils [req-d654cb55-8771-4383-b1d8-238310ea4873 req-73bab798-2701-4851-a75f-42174813fffc 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:28:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:09.066 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:22:a7:fb 2001:db8:0:1:f816:3eff:fe22:a7fb 2001:db8::f816:3eff:fe22:a7fb'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe22:a7fb/64 2001:db8::f816:3eff:fe22:a7fb/64', 'neutron:device_id': 'ovnmeta-667228a8-0d34-46bf-bae4-60029111a587', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667228a8-0d34-46bf-bae4-60029111a587', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1339bd99-bfc0-4236-8d6b-01aa3c8321c0, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=9514c506-92c6-4a98-ae6f-974d91d786ad) old=Port_Binding(mac=['fa:16:3e:22:a7:fb 2001:db8::f816:3eff:fe22:a7fb'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe22:a7fb/64', 'neutron:device_id': 'ovnmeta-667228a8-0d34-46bf-bae4-60029111a587', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-667228a8-0d34-46bf-bae4-60029111a587', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:28:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:09.067 104369 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 9514c506-92c6-4a98-ae6f-974d91d786ad in datapath 667228a8-0d34-46bf-bae4-60029111a587 updated#033[00m
Nov 28 11:28:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:09.069 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 667228a8-0d34-46bf-bae4-60029111a587, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:28:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:09.070 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[288f4730-d12a-48db-862e-d091c73b2823]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.740 187256 DEBUG nova.network.neutron [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updating instance_info_cache with network_info: [{"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.863 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Releasing lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.864 187256 DEBUG nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Instance network_info: |[{"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.864 187256 DEBUG oslo_concurrency.lockutils [req-d654cb55-8771-4383-b1d8-238310ea4873 req-73bab798-2701-4851-a75f-42174813fffc 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.864 187256 DEBUG nova.network.neutron [req-d654cb55-8771-4383-b1d8-238310ea4873 req-73bab798-2701-4851-a75f-42174813fffc 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Refreshing network info cache for port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.867 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Start _get_guest_xml network_info=[{"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.871 187256 WARNING nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.879 187256 DEBUG nova.virt.libvirt.host [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.880 187256 DEBUG nova.virt.libvirt.host [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.887 187256 DEBUG nova.virt.libvirt.host [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.888 187256 DEBUG nova.virt.libvirt.host [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.889 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.890 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.890 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.890 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.890 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.891 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.891 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.891 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.891 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.892 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.892 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.892 187256 DEBUG nova.virt.hardware [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.896 187256 DEBUG nova.virt.libvirt.vif [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:28:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1797102267',display_name='tempest-TestNetworkAdvancedServerOps-server-1797102267',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1797102267',id=34,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJWXQNoq9i14fJ2oMEnvHKoj5z47pddC9rW89mHqqhCU86I3zCb06Mwj3vaxCF0TkOfIjQvY2u8ugQzxEW+sF51eYGiPUBoZyEa/18LzqhM/C6ulPxonirUoF5gi5rGeSw==',key_name='tempest-TestNetworkAdvancedServerOps-1947953276',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-cjqomwv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:28:04Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=c4c0b9e1-fd20-4bc8-b105-53c8be08942f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.896 187256 DEBUG nova.network.os_vif_util [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.897 187256 DEBUG nova.network.os_vif_util [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.898 187256 DEBUG nova.objects.instance [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'pci_devices' on Instance uuid c4c0b9e1-fd20-4bc8-b105-53c8be08942f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.911 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <uuid>c4c0b9e1-fd20-4bc8-b105-53c8be08942f</uuid>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <name>instance-00000022</name>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1797102267</nova:name>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:28:09</nova:creationTime>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:        <nova:user uuid="5d381eba17324dd5ad798648b82d0115">tempest-TestNetworkAdvancedServerOps-762685809-project-member</nova:user>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:        <nova:project uuid="7e408bace48b41a1ac0677d300b6d288">tempest-TestNetworkAdvancedServerOps-762685809</nova:project>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:        <nova:port uuid="1a93323f-5f62-458a-b2d5-7d1745ecf9aa">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <entry name="serial">c4c0b9e1-fd20-4bc8-b105-53c8be08942f</entry>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <entry name="uuid">c4c0b9e1-fd20-4bc8-b105-53c8be08942f</entry>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.config"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:c0:5c:b3"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <target dev="tap1a93323f-5f"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/console.log" append="off"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:28:09 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:28:09 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:28:09 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:28:09 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.913 187256 DEBUG nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Preparing to wait for external event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.913 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.913 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.913 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.914 187256 DEBUG nova.virt.libvirt.vif [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:28:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1797102267',display_name='tempest-TestNetworkAdvancedServerOps-server-1797102267',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1797102267',id=34,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJWXQNoq9i14fJ2oMEnvHKoj5z47pddC9rW89mHqqhCU86I3zCb06Mwj3vaxCF0TkOfIjQvY2u8ugQzxEW+sF51eYGiPUBoZyEa/18LzqhM/C6ulPxonirUoF5gi5rGeSw==',key_name='tempest-TestNetworkAdvancedServerOps-1947953276',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-cjqomwv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:28:04Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=c4c0b9e1-fd20-4bc8-b105-53c8be08942f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.914 187256 DEBUG nova.network.os_vif_util [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.915 187256 DEBUG nova.network.os_vif_util [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.916 187256 DEBUG os_vif [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.917 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.917 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.918 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.920 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.920 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a93323f-5f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.921 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1a93323f-5f, col_values=(('external_ids', {'iface-id': '1a93323f-5f62-458a-b2d5-7d1745ecf9aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:5c:b3', 'vm-uuid': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.922 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:09 np0005538960 NetworkManager[55548]: <info>  [1764347289.9236] manager: (tap1a93323f-5f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.926 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.929 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:09 np0005538960 nova_compute[187252]: 2025-11-28 16:28:09.930 187256 INFO os_vif [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f')
Nov 28 11:28:10 np0005538960 nova_compute[187252]: 2025-11-28 16:28:10.001 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 11:28:10 np0005538960 nova_compute[187252]: 2025-11-28 16:28:10.001 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Nov 28 11:28:10 np0005538960 nova_compute[187252]: 2025-11-28 16:28:10.002 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No VIF found with MAC fa:16:3e:c0:5c:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Nov 28 11:28:10 np0005538960 nova_compute[187252]: 2025-11-28 16:28:10.002 187256 INFO nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Using config drive
Nov 28 11:28:11 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:11.880 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:28:11 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:11.880 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 11:28:11 np0005538960 nova_compute[187252]: 2025-11-28 16:28:11.881 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:12 np0005538960 podman[219879]: 2025-11-28 16:28:12.165211863 +0000 UTC m=+0.070549188 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 11:28:12 np0005538960 podman[219880]: 2025-11-28 16:28:12.190331022 +0000 UTC m=+0.093213691 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:28:12 np0005538960 nova_compute[187252]: 2025-11-28 16:28:12.868 187256 INFO nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Creating config drive at /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.config
Nov 28 11:28:12 np0005538960 nova_compute[187252]: 2025-11-28 16:28:12.873 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkkeaf0sf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.000 187256 DEBUG oslo_concurrency.processutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkkeaf0sf" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:28:13 np0005538960 kernel: tap1a93323f-5f: entered promiscuous mode
Nov 28 11:28:13 np0005538960 NetworkManager[55548]: <info>  [1764347293.0747] manager: (tap1a93323f-5f): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Nov 28 11:28:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:13Z|00152|binding|INFO|Claiming lport 1a93323f-5f62-458a-b2d5-7d1745ecf9aa for this chassis.
Nov 28 11:28:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:13Z|00153|binding|INFO|1a93323f-5f62-458a-b2d5-7d1745ecf9aa: Claiming fa:16:3e:c0:5c:b3 10.100.0.6
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.076 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.080 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.083 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.099 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:5c:b3 10.100.0.6'], port_security=['fa:16:3e:c0:5c:b3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ce57eab6-06a0-474d-a6ea-12656babb7ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1984d677-5f2f-49ac-a5f9-2343abc938a7, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=1a93323f-5f62-458a-b2d5-7d1745ecf9aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.100 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa in datapath 6b8cb1a4-1232-45b7-a54f-e85635df6a5a bound to our chassis
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.101 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6b8cb1a4-1232-45b7-a54f-e85635df6a5a
Nov 28 11:28:13 np0005538960 systemd-udevd[219935]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.114 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ed7977d1-7b3d-4482-a66e-93bd6e6788ef]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.115 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6b8cb1a4-11 in ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.117 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6b8cb1a4-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.118 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[be473619-a8a4-4832-bd12-9885e05fb4b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.119 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[72309ece-d3f2-46e8-9e20-a627a09f1381]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 systemd-machined[153518]: New machine qemu-11-instance-00000022.
Nov 28 11:28:13 np0005538960 NetworkManager[55548]: <info>  [1764347293.1269] device (tap1a93323f-5f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:28:13 np0005538960 NetworkManager[55548]: <info>  [1764347293.1282] device (tap1a93323f-5f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.134 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.135 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[e3a28919-0f8b-4a0c-9da0-4631eda3fde1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.137 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:13 np0005538960 systemd[1]: Started Virtual Machine qemu-11-instance-00000022.
Nov 28 11:28:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:13Z|00154|binding|INFO|Setting lport 1a93323f-5f62-458a-b2d5-7d1745ecf9aa ovn-installed in OVS
Nov 28 11:28:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:13Z|00155|binding|INFO|Setting lport 1a93323f-5f62-458a-b2d5-7d1745ecf9aa up in Southbound
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.143 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.151 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f7e869d8-67f8-4afe-badb-94872ade2e05]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.189 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[91247c9f-2a26-41f5-8a3e-eadfd648999b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 systemd-udevd[219940]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:28:13 np0005538960 NetworkManager[55548]: <info>  [1764347293.1995] manager: (tap6b8cb1a4-10): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.198 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[726f6a9b-c5c5-4f94-a306-d40ef6465b3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.237 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[8e5c5dab-2993-4016-bee7-5e66d4b4bd28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.240 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[8b96179e-9819-4302-839b-4bc58c01ef29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 NetworkManager[55548]: <info>  [1764347293.2714] device (tap6b8cb1a4-10): carrier: link connected
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.275 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[bdcda7dd-fd16-4dfa-b50f-7eca42eb8df5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.295 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[14feb0b8-76d5-434e-ac47-59ba5dc431a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b8cb1a4-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:56:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439285, 'reachable_time': 39064, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219970, 'error': None, 'target': 'ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.312 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[7a7c31da-ccf1-4de5-b2fc-b0374a08df9b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe55:5618'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439285, 'tstamp': 439285}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219971, 'error': None, 'target': 'ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.334 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5bc5ba40-0259-4b49-a02a-432d3d480d33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b8cb1a4-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:56:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 3, 'tx_packets': 1, 'rx_bytes': 266, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439285, 'reachable_time': 39064, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 3, 'inoctets': 224, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 3, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 224, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 3, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219972, 'error': None, 'target': 'ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.370 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[09bdab20-605f-4c52-8238-a24e3f5c761d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.436 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[eec6c093-b1f2-43c0-a55d-3ab57ef52684]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.438 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b8cb1a4-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.438 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.439 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b8cb1a4-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:13 np0005538960 kernel: tap6b8cb1a4-10: entered promiscuous mode
Nov 28 11:28:13 np0005538960 NetworkManager[55548]: <info>  [1764347293.4421] manager: (tap6b8cb1a4-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.446 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6b8cb1a4-10, col_values=(('external_ids', {'iface-id': '2258088d-2e87-495c-97b9-2f7fdc663c51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:13 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:13Z|00156|binding|INFO|Releasing lport 2258088d-2e87-495c-97b9-2f7fdc663c51 from this chassis (sb_readonly=0)
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.449 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6b8cb1a4-1232-45b7-a54f-e85635df6a5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6b8cb1a4-1232-45b7-a54f-e85635df6a5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.450 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a5f73cd7-c52c-45cf-aba6-e6db68212c90]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.451 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-6b8cb1a4-1232-45b7-a54f-e85635df6a5a
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/6b8cb1a4-1232-45b7-a54f-e85635df6a5a.pid.haproxy
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID 6b8cb1a4-1232-45b7-a54f-e85635df6a5a
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:28:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:13.452 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'env', 'PROCESS_TAG=haproxy-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6b8cb1a4-1232-45b7-a54f-e85635df6a5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.459 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.463 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:13 np0005538960 podman[220004]: 2025-11-28 16:28:13.854958225 +0000 UTC m=+0.055824865 container create e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:28:13 np0005538960 systemd[1]: Started libpod-conmon-e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396.scope.
Nov 28 11:28:13 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.911 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347293.9109704, c4c0b9e1-fd20-4bc8-b105-53c8be08942f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.912 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] VM Started (Lifecycle Event)#033[00m
Nov 28 11:28:13 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/05109c14401fac5fc143dad3f68beb4346853578ad96ac42f48b56c00fe127ad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:28:13 np0005538960 podman[220004]: 2025-11-28 16:28:13.826178834 +0000 UTC m=+0.027045494 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.928 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:28:13 np0005538960 podman[220004]: 2025-11-28 16:28:13.931290505 +0000 UTC m=+0.132157165 container init e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.932 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347293.911091, c4c0b9e1-fd20-4bc8-b105-53c8be08942f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.932 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:28:13 np0005538960 podman[220004]: 2025-11-28 16:28:13.937686639 +0000 UTC m=+0.138553269 container start e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.948 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.952 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:28:13 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220026]: [NOTICE]   (220030) : New worker (220032) forked
Nov 28 11:28:13 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220026]: [NOTICE]   (220030) : Loading success.
Nov 28 11:28:13 np0005538960 nova_compute[187252]: 2025-11-28 16:28:13.968 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:28:14 np0005538960 nova_compute[187252]: 2025-11-28 16:28:14.923 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.128 187256 DEBUG nova.network.neutron [req-d654cb55-8771-4383-b1d8-238310ea4873 req-73bab798-2701-4851-a75f-42174813fffc 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updated VIF entry in instance network info cache for port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.129 187256 DEBUG nova.network.neutron [req-d654cb55-8771-4383-b1d8-238310ea4873 req-73bab798-2701-4851-a75f-42174813fffc 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updating instance_info_cache with network_info: [{"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.159 187256 DEBUG oslo_concurrency.lockutils [req-d654cb55-8771-4383-b1d8-238310ea4873 req-73bab798-2701-4851-a75f-42174813fffc 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.225 187256 DEBUG nova.compute.manager [req-b1a790a8-166a-4372-be9d-fb1e24a99e70 req-a3286752-ac5a-452e-be66-16e2f8251870 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.226 187256 DEBUG oslo_concurrency.lockutils [req-b1a790a8-166a-4372-be9d-fb1e24a99e70 req-a3286752-ac5a-452e-be66-16e2f8251870 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.226 187256 DEBUG oslo_concurrency.lockutils [req-b1a790a8-166a-4372-be9d-fb1e24a99e70 req-a3286752-ac5a-452e-be66-16e2f8251870 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.226 187256 DEBUG oslo_concurrency.lockutils [req-b1a790a8-166a-4372-be9d-fb1e24a99e70 req-a3286752-ac5a-452e-be66-16e2f8251870 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.226 187256 DEBUG nova.compute.manager [req-b1a790a8-166a-4372-be9d-fb1e24a99e70 req-a3286752-ac5a-452e-be66-16e2f8251870 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Processing event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.227 187256 DEBUG nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.230 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347295.230538, c4c0b9e1-fd20-4bc8-b105-53c8be08942f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.231 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.232 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.236 187256 INFO nova.virt.libvirt.driver [-] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Instance spawned successfully.#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.236 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.262 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.266 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.267 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.267 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.267 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.268 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.268 187256 DEBUG nova.virt.libvirt.driver [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.273 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.303 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.340 187256 INFO nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Took 10.45 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.341 187256 DEBUG nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.408 187256 INFO nova.compute.manager [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Took 11.11 seconds to build instance.#033[00m
Nov 28 11:28:15 np0005538960 nova_compute[187252]: 2025-11-28 16:28:15.429 187256 DEBUG oslo_concurrency.lockutils [None req-0b11f8c3-e8f6-485d-ae38-60190a23a9a4 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:17 np0005538960 podman[220041]: 2025-11-28 16:28:17.169504758 +0000 UTC m=+0.067099091 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:28:18 np0005538960 nova_compute[187252]: 2025-11-28 16:28:18.140 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:18 np0005538960 nova_compute[187252]: 2025-11-28 16:28:18.346 187256 DEBUG nova.compute.manager [req-2a3697fe-6095-4bd3-9c99-cbb7719a76c1 req-c362ad09-0993-4002-8876-8d9b8ea19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:28:18 np0005538960 nova_compute[187252]: 2025-11-28 16:28:18.346 187256 DEBUG oslo_concurrency.lockutils [req-2a3697fe-6095-4bd3-9c99-cbb7719a76c1 req-c362ad09-0993-4002-8876-8d9b8ea19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:18 np0005538960 nova_compute[187252]: 2025-11-28 16:28:18.346 187256 DEBUG oslo_concurrency.lockutils [req-2a3697fe-6095-4bd3-9c99-cbb7719a76c1 req-c362ad09-0993-4002-8876-8d9b8ea19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:18 np0005538960 nova_compute[187252]: 2025-11-28 16:28:18.347 187256 DEBUG oslo_concurrency.lockutils [req-2a3697fe-6095-4bd3-9c99-cbb7719a76c1 req-c362ad09-0993-4002-8876-8d9b8ea19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:18 np0005538960 nova_compute[187252]: 2025-11-28 16:28:18.347 187256 DEBUG nova.compute.manager [req-2a3697fe-6095-4bd3-9c99-cbb7719a76c1 req-c362ad09-0993-4002-8876-8d9b8ea19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] No waiting events found dispatching network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:28:18 np0005538960 nova_compute[187252]: 2025-11-28 16:28:18.347 187256 WARNING nova.compute.manager [req-2a3697fe-6095-4bd3-9c99-cbb7719a76c1 req-c362ad09-0993-4002-8876-8d9b8ea19585 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received unexpected event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa for instance with vm_state active and task_state None.#033[00m
Nov 28 11:28:19 np0005538960 nova_compute[187252]: 2025-11-28 16:28:19.925 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:20 np0005538960 nova_compute[187252]: 2025-11-28 16:28:20.312 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:20 np0005538960 NetworkManager[55548]: <info>  [1764347300.3136] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Nov 28 11:28:20 np0005538960 NetworkManager[55548]: <info>  [1764347300.3143] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Nov 28 11:28:20 np0005538960 nova_compute[187252]: 2025-11-28 16:28:20.317 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:28:20 np0005538960 nova_compute[187252]: 2025-11-28 16:28:20.320 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 11:28:20 np0005538960 nova_compute[187252]: 2025-11-28 16:28:20.475 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:20Z|00157|binding|INFO|Releasing lport 2258088d-2e87-495c-97b9-2f7fdc663c51 from this chassis (sb_readonly=0)
Nov 28 11:28:20 np0005538960 nova_compute[187252]: 2025-11-28 16:28:20.501 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:20.882 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 11:28:20 np0005538960 nova_compute[187252]: 2025-11-28 16:28:20.918 187256 DEBUG nova.compute.manager [req-34372a49-b5c3-4b89-b3b5-95a0dbd374fb req-a9d7c120-9df8-40e2-b51b-66b6152d0511 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-changed-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 11:28:20 np0005538960 nova_compute[187252]: 2025-11-28 16:28:20.919 187256 DEBUG nova.compute.manager [req-34372a49-b5c3-4b89-b3b5-95a0dbd374fb req-a9d7c120-9df8-40e2-b51b-66b6152d0511 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Refreshing instance network info cache due to event network-changed-1a93323f-5f62-458a-b2d5-7d1745ecf9aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Nov 28 11:28:20 np0005538960 nova_compute[187252]: 2025-11-28 16:28:20.919 187256 DEBUG oslo_concurrency.lockutils [req-34372a49-b5c3-4b89-b3b5-95a0dbd374fb req-a9d7c120-9df8-40e2-b51b-66b6152d0511 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Nov 28 11:28:20 np0005538960 nova_compute[187252]: 2025-11-28 16:28:20.920 187256 DEBUG oslo_concurrency.lockutils [req-34372a49-b5c3-4b89-b3b5-95a0dbd374fb req-a9d7c120-9df8-40e2-b51b-66b6152d0511 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Nov 28 11:28:20 np0005538960 nova_compute[187252]: 2025-11-28 16:28:20.920 187256 DEBUG nova.network.neutron [req-34372a49-b5c3-4b89-b3b5-95a0dbd374fb req-a9d7c120-9df8-40e2-b51b-66b6152d0511 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Refreshing network info cache for port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Nov 28 11:28:21 np0005538960 podman[220066]: 2025-11-28 16:28:21.160039385 +0000 UTC m=+0.064355249 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal)
Nov 28 11:28:22 np0005538960 nova_compute[187252]: 2025-11-28 16:28:22.317 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.141 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.394 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.396 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.396 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.396 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.501 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.565 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.567 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.627 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.790 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.792 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5614MB free_disk=73.33716201782227GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.792 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.793 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.857 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance c4c0b9e1-fd20-4bc8-b105-53c8be08942f actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.858 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.858 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.899 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.916 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.950 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 11:28:23 np0005538960 nova_compute[187252]: 2025-11-28 16:28:23.951 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:28:24 np0005538960 nova_compute[187252]: 2025-11-28 16:28:24.258 187256 DEBUG nova.network.neutron [req-34372a49-b5c3-4b89-b3b5-95a0dbd374fb req-a9d7c120-9df8-40e2-b51b-66b6152d0511 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updated VIF entry in instance network info cache for port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Nov 28 11:28:24 np0005538960 nova_compute[187252]: 2025-11-28 16:28:24.259 187256 DEBUG nova.network.neutron [req-34372a49-b5c3-4b89-b3b5-95a0dbd374fb req-a9d7c120-9df8-40e2-b51b-66b6152d0511 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updating instance_info_cache with network_info: [{"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 11:28:24 np0005538960 nova_compute[187252]: 2025-11-28 16:28:24.280 187256 DEBUG oslo_concurrency.lockutils [req-34372a49-b5c3-4b89-b3b5-95a0dbd374fb req-a9d7c120-9df8-40e2-b51b-66b6152d0511 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 11:28:24 np0005538960 nova_compute[187252]: 2025-11-28 16:28:24.927 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:24 np0005538960 nova_compute[187252]: 2025-11-28 16:28:24.946 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:28:24 np0005538960 nova_compute[187252]: 2025-11-28 16:28:24.947 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:28:24 np0005538960 nova_compute[187252]: 2025-11-28 16:28:24.947 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 11:28:24 np0005538960 nova_compute[187252]: 2025-11-28 16:28:24.969 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 11:28:24 np0005538960 nova_compute[187252]: 2025-11-28 16:28:24.969 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:28:24 np0005538960 nova_compute[187252]: 2025-11-28 16:28:24.970 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:28:27 np0005538960 nova_compute[187252]: 2025-11-28 16:28:27.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:28:28 np0005538960 nova_compute[187252]: 2025-11-28 16:28:28.143 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:29 np0005538960 podman[220108]: 2025-11-28 16:28:29.155901799 +0000 UTC m=+0.062748532 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Nov 28 11:28:29 np0005538960 nova_compute[187252]: 2025-11-28 16:28:29.930 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:30 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:30Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c0:5c:b3 10.100.0.6
Nov 28 11:28:30 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:30Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:5c:b3 10.100.0.6
Nov 28 11:28:33 np0005538960 nova_compute[187252]: 2025-11-28 16:28:33.145 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:33 np0005538960 podman[220129]: 2025-11-28 16:28:33.170061568 +0000 UTC m=+0.075962705 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 11:28:34 np0005538960 nova_compute[187252]: 2025-11-28 16:28:34.933 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.313 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000022', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '7e408bace48b41a1ac0677d300b6d288', 'user_id': '5d381eba17324dd5ad798648b82d0115', 'hostId': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.314 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.317 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c4c0b9e1-fd20-4bc8-b105-53c8be08942f / tap1a93323f-5f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.318 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '118e70f8-5286-4ee2-875b-33e0d08cd827', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'instance-00000022-c4c0b9e1-fd20-4bc8-b105-53c8be08942f-tap1a93323f-5f', 'timestamp': '2025-11-28T16:28:35.314722', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'tap1a93323f-5f', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:5c:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a93323f-5f'}, 'message_id': '49678f48-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.948878127, 'message_signature': '66322970ad2f160b7907ab8e2f075007199a9919b8b9f6eb1a441a2973215ffd'}]}, 'timestamp': '2025-11-28 16:28:35.318560', '_unique_id': 'a71c7eb07d50431a9d1d83f880a786f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.319 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.320 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.320 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '746af16e-ebac-48be-be66-d9c8af8e799a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'instance-00000022-c4c0b9e1-fd20-4bc8-b105-53c8be08942f-tap1a93323f-5f', 'timestamp': '2025-11-28T16:28:35.320803', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'tap1a93323f-5f', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:5c:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a93323f-5f'}, 'message_id': '4967f60e-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.948878127, 'message_signature': '7aff8f6cf616b6ae4faf1c33e27fa51bff9160a60d6a66ab04e2cb836f5c381c'}]}, 'timestamp': '2025-11-28 16:28:35.321142', '_unique_id': '8d88b9e2e02e47049694128885e7ec37'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.321 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.322 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.336 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/memory.usage volume: 40.43359375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e3d4ce3c-0f99-4afb-9ac7-244d676912d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.43359375, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'timestamp': '2025-11-28T16:28:35.322520', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '496a4f30-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.970090145, 'message_signature': 'b553f467d894cfd305f9a454bdc725fe2e794c202de5bcb3d099fa075535e1d3'}]}, 'timestamp': '2025-11-28 16:28:35.336586', '_unique_id': '073f93bcdeb642af8ba3d3133b67e3ee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.337 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.338 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.361 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.read.requests volume: 1087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.362 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e7a05c2c-a6f0-4a3d-a03c-6bfd636642c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1087, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-vda', 'timestamp': '2025-11-28T16:28:35.338355', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '496e4c98-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': 'b23d807a36bac2699ea4568b2106325338bd4bdb2388a0c4f4eee7b96d2b38c1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 
'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-sda', 'timestamp': '2025-11-28T16:28:35.338355', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '496e59cc-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': '36142d5686ea89ee59821501174e1ed0de2d6d0de3590d894f8695984e7a53c7'}]}, 'timestamp': '2025-11-28 16:28:35.363032', '_unique_id': '1eee001625dc408297b3638a7ca5038c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.364 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.373 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.374 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '653c2b60-e919-4e70-9f36-d60dc4322012', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-vda', 'timestamp': '2025-11-28T16:28:35.365058', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '497018c0-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.999248996, 'message_signature': '45aec512f6edeb919981388a527af79f3784ff06cbc3cd373cdbbe3ee523e5bd'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 
'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-sda', 'timestamp': '2025-11-28T16:28:35.365058', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '49702572-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.999248996, 'message_signature': 'edfda75365ad4aba50cbd16f69a0601644aebd93bad27612d9e3f1f040304b16'}]}, 'timestamp': '2025-11-28 16:28:35.374984', '_unique_id': 'f085d32b2de245e88af69af77bd8e055'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.375 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.376 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.376 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.read.bytes volume: 30206464 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.377 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b5ed650-92e0-49a2-ac82-ccf66942add1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30206464, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-vda', 'timestamp': '2025-11-28T16:28:35.376735', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '49707f22-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': '12b8e1002d281039ce343ff7f243f0128fe48a752058adfeafcfe246e0fdd982'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': 
None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-sda', 'timestamp': '2025-11-28T16:28:35.376735', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '49708ada-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': '5536634c604ac63c5093155b2eb1e19dfe7c60fa3dcad5c18865cd0a5f76c97a'}]}, 'timestamp': '2025-11-28 16:28:35.377368', '_unique_id': '8b5a9f9bf4a3403798e875302e8acdb6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.378 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.write.bytes volume: 72892416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.379 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '921e71b2-93bc-4385-8e43-9c136566f940', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72892416, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-vda', 'timestamp': '2025-11-28T16:28:35.378917', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4970d38c-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': '101e5982eea6145d35583607fbc63964bc11b78ec0a3c2e1b351ac3a6196a252'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 
'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-sda', 'timestamp': '2025-11-28T16:28:35.378917', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4970deea-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': 'd56cf2b9ad56a49b4c200a9c060bb334cd08f15d89466b92cfaf3673a28215d3'}]}, 'timestamp': '2025-11-28 16:28:35.379519', '_unique_id': 'e337fc2fb0254c668693df600bd59fc0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.380 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.381 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.write.requests volume: 302 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.381 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3ed0ca2a-5227-4377-9d8a-445a4e0b4800', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 302, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-vda', 'timestamp': '2025-11-28T16:28:35.381038', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '49712620-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': 'e392df1b79143e60ab9a6aa9bb4fceea05c628dc5138216be525912ab1ce8aef'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 
'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-sda', 'timestamp': '2025-11-28T16:28:35.381038', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '49713174-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': 'fca626819f6bfdd60733f03b1bdd6e23c4d1cf772d3604e6c3216eb2d4b9fac8'}]}, 'timestamp': '2025-11-28 16:28:35.381632', '_unique_id': '67e2e13f9e1b46e790f79eec037d4849'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.382 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.383 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.383 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.383 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a400cc2-52c6-4fd1-9755-acb9676f2424', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-vda', 'timestamp': '2025-11-28T16:28:35.383162', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '49717904-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.999248996, 'message_signature': 'ab9fd734b053106830744d95e1638dc0e00517f8f97c50d5e5f5f725ab3b5b21'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-sda', 'timestamp': '2025-11-28T16:28:35.383162', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4971846c-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.999248996, 'message_signature': '55d89c696b45a9d43b60e1f6d4f5b85c1ac43a8f02c322ba13f14af98ec05228'}]}, 'timestamp': '2025-11-28 16:28:35.383757', '_unique_id': 'fcab307f216844cd87db4f88f8cfcc18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.384 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.385 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.385 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/network.outgoing.bytes volume: 1396 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c9a1d4a2-5f7c-4cfd-bb32-5e5698b6ef7a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1396, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'instance-00000022-c4c0b9e1-fd20-4bc8-b105-53c8be08942f-tap1a93323f-5f', 'timestamp': '2025-11-28T16:28:35.385443', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'tap1a93323f-5f', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:5c:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a93323f-5f'}, 'message_id': '4971d2be-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.948878127, 'message_signature': '0194d577e1afa25ef9f9f4a55b8001ef07d550e8c7464c94a948d890e6ecf58f'}]}, 'timestamp': '2025-11-28 16:28:35.385784', '_unique_id': '765e1e4b6774408e84edf49bac207b05'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.386 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.387 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.387 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.read.latency volume: 173707603 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.387 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.read.latency volume: 23079096 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4f817926-9ee0-4cb3-a60d-1e82e0c43f1d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 173707603, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-vda', 'timestamp': '2025-11-28T16:28:35.387282', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '497219f4-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': 'b701e4f314ac911b6ed119b7a1dfaab275c9e8b282b1ade0d2dcbef35d4899d3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 23079096, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 
'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-sda', 'timestamp': '2025-11-28T16:28:35.387282', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4972253e-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': '1c7d8d1281b900186af7a16555ea9b215bc3c6103ac23b2586840a9d310d2a75'}]}, 'timestamp': '2025-11-28 16:28:35.388032', '_unique_id': '86717b907cb24d368bf24020b4fb6917'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.388 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.389 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.389 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.write.latency volume: 2849043988 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.389 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2acca05f-cb46-4c39-a854-216d7102dd44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2849043988, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-vda', 'timestamp': '2025-11-28T16:28:35.389518', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '4972714c-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': 'dc0d36a5ac6e8cecc03a22cb1fcf6382910d3aea07439ae8c0dcd1ae14e733fc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 
'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-sda', 'timestamp': '2025-11-28T16:28:35.389518', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '49727d9a-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.972514174, 'message_signature': '990fa2fc25f90e7a3d4e45397695fd2d999e0aeeb673f32fe3c1588fe190ccc6'}]}, 'timestamp': '2025-11-28 16:28:35.390135', '_unique_id': 'b816181193894a7ca8d7e6dae1debb3b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.390 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.391 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.391 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/network.outgoing.packets volume: 12 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '018ec35e-f5fc-4f6b-a02c-b5388eb7c493', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 12, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'instance-00000022-c4c0b9e1-fd20-4bc8-b105-53c8be08942f-tap1a93323f-5f', 'timestamp': '2025-11-28T16:28:35.391653', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'tap1a93323f-5f', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:5c:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a93323f-5f'}, 'message_id': '4972c4da-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.948878127, 'message_signature': '387bf075ecd0bfb4faad184d8d8754bd4b371a3284cabe1adf9cfb1e298569b2'}]}, 'timestamp': '2025-11-28 16:28:35.391996', '_unique_id': 'cf8c52055f7642819699fbf7e9d56935'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.392 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.393 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.393 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '13a823ad-d3e9-4a52-b80d-dc295c03d60a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'instance-00000022-c4c0b9e1-fd20-4bc8-b105-53c8be08942f-tap1a93323f-5f', 'timestamp': '2025-11-28T16:28:35.393441', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'tap1a93323f-5f', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:5c:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a93323f-5f'}, 'message_id': '49730a9e-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.948878127, 'message_signature': '7ad1849a1d22e99118940b7409fb9da295cd05ca275e57f8287eb451c030bce6'}]}, 'timestamp': '2025-11-28 16:28:35.393761', '_unique_id': '09b0651dab23401397303c6596d5bebc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.394 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.395 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.395 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/network.incoming.bytes volume: 1940 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4c853df4-9ccf-43fd-bd08-6d12af619fc4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1940, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'instance-00000022-c4c0b9e1-fd20-4bc8-b105-53c8be08942f-tap1a93323f-5f', 'timestamp': '2025-11-28T16:28:35.395256', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'tap1a93323f-5f', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:5c:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a93323f-5f'}, 'message_id': '4973517a-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.948878127, 'message_signature': '8bf9041670a4051f0e10c4cb10af77b70bd22b55d94309ca451284a0ba22475a'}]}, 'timestamp': '2025-11-28 16:28:35.395597', '_unique_id': '9bb7be97f16e408d85aa9c84e1a4fb24'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.396 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.397 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.397 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6654742e-8c13-4757-9016-3644d60a3d56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-vda', 'timestamp': '2025-11-28T16:28:35.397070', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '49739838-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.999248996, 'message_signature': '3b4ea3b4bd31e9d3eff2164036fc7686635d3237610d3f118f200be90600fff9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f-sda', 'timestamp': '2025-11-28T16:28:35.397070', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '4973a404-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.999248996, 'message_signature': '4d13a1e4b25cf4c2b2e0f4706089dc8fd934494fd98f16fb81ec78a5f63944b1'}]}, 'timestamp': '2025-11-28 16:28:35.397673', '_unique_id': '5fe9e3d69f9943caa4b689e5a9c7d5f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.398 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.399 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.399 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1797102267>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1797102267>]
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.399 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.399 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.399 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1797102267>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1797102267>]
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.400 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.400 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.400 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1797102267>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1797102267>]
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.400 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.400 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ecf32f5a-b31f-4c00-95ed-778f9a2f30ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'instance-00000022-c4c0b9e1-fd20-4bc8-b105-53c8be08942f-tap1a93323f-5f', 'timestamp': '2025-11-28T16:28:35.400578', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'tap1a93323f-5f', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:5c:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a93323f-5f'}, 'message_id': '49742172-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.948878127, 'message_signature': '91fa42c3c66eae75509436d55eafc64ba976ea4558f2cbf7d69a2cd1fae1f325'}]}, 'timestamp': '2025-11-28 16:28:35.400920', '_unique_id': 'e9c25769d40a4b2c9e6e050304591e4f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.401 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.402 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.402 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.402 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1797102267>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1797102267>]
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.402 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c406d05-9b51-40c3-9882-76c70700cebb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'instance-00000022-c4c0b9e1-fd20-4bc8-b105-53c8be08942f-tap1a93323f-5f', 'timestamp': '2025-11-28T16:28:35.402999', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'tap1a93323f-5f', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:5c:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a93323f-5f'}, 'message_id': '4974800e-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.948878127, 'message_signature': 'a05a68443b242ab82a71466ba8e9c1a8696adf4258c03842aa0de4bde95a252e'}]}, 'timestamp': '2025-11-28 16:28:35.403320', '_unique_id': 'fbc9348204f94966a9293321321858d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.404 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.404 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0e370861-36f3-432d-927f-cc53e347cad0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'instance-00000022-c4c0b9e1-fd20-4bc8-b105-53c8be08942f-tap1a93323f-5f', 'timestamp': '2025-11-28T16:28:35.404795', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'tap1a93323f-5f', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:5c:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a93323f-5f'}, 'message_id': '4974c6d6-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.948878127, 'message_signature': 'c6410037c98ad9b3eac6142a6fff3234894a49d1d882abbe783480644a4c8e0a'}]}, 'timestamp': '2025-11-28 16:28:35.405133', '_unique_id': 'a423303f361c4e0ab88137fa40b6b0cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.405 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.406 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.406 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32e6c05d-0a93-4f9b-a27e-88ac30d6d7ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'instance-00000022-c4c0b9e1-fd20-4bc8-b105-53c8be08942f-tap1a93323f-5f', 'timestamp': '2025-11-28T16:28:35.406642', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'tap1a93323f-5f', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c0:5c:b3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap1a93323f-5f'}, 'message_id': '49750e20-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.948878127, 'message_signature': '8d393d506be1fb16210deaad2d7f8049f999d2b56ae63bbd383f4782d3ff2bce'}]}, 'timestamp': '2025-11-28 16:28:35.406971', '_unique_id': '82600c0c763149c983baa69685cae9f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.407 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 DEBUG ceilometer.compute.pollsters [-] c4c0b9e1-fd20-4bc8-b105-53c8be08942f/cpu volume: 12200000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a9764521-c62b-4bd9-a6c1-c80061ab47b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12200000000, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_name': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_name': None, 'resource_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'timestamp': '2025-11-28T16:28:35.408083', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1797102267', 'name': 'instance-00000022', 'instance_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'instance_type': 'm1.nano', 'host': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '49754610-cc77-11f0-bcca-fa163efe7585', 'monotonic_time': 4414.970090145, 'message_signature': '634a4f1c55165efa0873edc1da683fd3821a7b9260941e72f041b987d8b412f4'}]}, 'timestamp': '2025-11-28 16:28:35.408374', '_unique_id': '77e9167d61014b7097775b746b2de957'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:28:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:28:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:28:35 np0005538960 nova_compute[187252]: 2025-11-28 16:28:35.943 187256 INFO nova.compute.manager [None req-5c21b3d6-6726-4e53-8b63-c147ff43c691 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Get console output#033[00m
Nov 28 11:28:35 np0005538960 nova_compute[187252]: 2025-11-28 16:28:35.948 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:28:38 np0005538960 nova_compute[187252]: 2025-11-28 16:28:38.148 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:39 np0005538960 podman[220153]: 2025-11-28 16:28:39.200975323 +0000 UTC m=+0.100608437 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 11:28:39 np0005538960 nova_compute[187252]: 2025-11-28 16:28:39.935 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:41 np0005538960 nova_compute[187252]: 2025-11-28 16:28:41.824 187256 DEBUG nova.compute.manager [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Nov 28 11:28:41 np0005538960 nova_compute[187252]: 2025-11-28 16:28:41.961 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:41 np0005538960 nova_compute[187252]: 2025-11-28 16:28:41.962 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:41 np0005538960 nova_compute[187252]: 2025-11-28 16:28:41.985 187256 DEBUG nova.objects.instance [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'pci_requests' on Instance uuid c4c0b9e1-fd20-4bc8-b105-53c8be08942f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:28:41 np0005538960 nova_compute[187252]: 2025-11-28 16:28:41.998 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:28:41 np0005538960 nova_compute[187252]: 2025-11-28 16:28:41.999 187256 INFO nova.compute.claims [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:28:42 np0005538960 nova_compute[187252]: 2025-11-28 16:28:41.999 187256 DEBUG nova.objects.instance [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'resources' on Instance uuid c4c0b9e1-fd20-4bc8-b105-53c8be08942f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:28:42 np0005538960 nova_compute[187252]: 2025-11-28 16:28:42.012 187256 DEBUG nova.objects.instance [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'pci_devices' on Instance uuid c4c0b9e1-fd20-4bc8-b105-53c8be08942f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:28:42 np0005538960 nova_compute[187252]: 2025-11-28 16:28:42.058 187256 INFO nova.compute.resource_tracker [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updating resource usage from migration afda9bf2-339b-446f-9d51-45c46989758c#033[00m
Nov 28 11:28:42 np0005538960 nova_compute[187252]: 2025-11-28 16:28:42.144 187256 DEBUG nova.compute.provider_tree [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:28:42 np0005538960 nova_compute[187252]: 2025-11-28 16:28:42.162 187256 DEBUG nova.scheduler.client.report [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:28:42 np0005538960 nova_compute[187252]: 2025-11-28 16:28:42.183 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:42 np0005538960 nova_compute[187252]: 2025-11-28 16:28:42.183 187256 INFO nova.compute.manager [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Migrating#033[00m
Nov 28 11:28:42 np0005538960 nova_compute[187252]: 2025-11-28 16:28:42.217 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:28:42 np0005538960 nova_compute[187252]: 2025-11-28 16:28:42.218 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquired lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:28:42 np0005538960 nova_compute[187252]: 2025-11-28 16:28:42.218 187256 DEBUG nova.network.neutron [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:28:43 np0005538960 nova_compute[187252]: 2025-11-28 16:28:43.151 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:43 np0005538960 podman[220180]: 2025-11-28 16:28:43.158842388 +0000 UTC m=+0.064049303 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 28 11:28:43 np0005538960 podman[220179]: 2025-11-28 16:28:43.159411703 +0000 UTC m=+0.066412953 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:28:44 np0005538960 nova_compute[187252]: 2025-11-28 16:28:44.443 187256 DEBUG nova.network.neutron [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updating instance_info_cache with network_info: [{"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:28:44 np0005538960 nova_compute[187252]: 2025-11-28 16:28:44.460 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Releasing lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:28:44 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:44Z|00158|binding|INFO|Releasing lport 2258088d-2e87-495c-97b9-2f7fdc663c51 from this chassis (sb_readonly=0)
Nov 28 11:28:44 np0005538960 nova_compute[187252]: 2025-11-28 16:28:44.515 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:44 np0005538960 nova_compute[187252]: 2025-11-28 16:28:44.609 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Nov 28 11:28:44 np0005538960 nova_compute[187252]: 2025-11-28 16:28:44.614 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 28 11:28:44 np0005538960 nova_compute[187252]: 2025-11-28 16:28:44.938 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:46 np0005538960 kernel: tap1a93323f-5f (unregistering): left promiscuous mode
Nov 28 11:28:46 np0005538960 NetworkManager[55548]: <info>  [1764347326.7985] device (tap1a93323f-5f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:28:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:46Z|00159|binding|INFO|Releasing lport 1a93323f-5f62-458a-b2d5-7d1745ecf9aa from this chassis (sb_readonly=0)
Nov 28 11:28:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:46Z|00160|binding|INFO|Setting lport 1a93323f-5f62-458a-b2d5-7d1745ecf9aa down in Southbound
Nov 28 11:28:46 np0005538960 nova_compute[187252]: 2025-11-28 16:28:46.804 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:46Z|00161|binding|INFO|Removing iface tap1a93323f-5f ovn-installed in OVS
Nov 28 11:28:46 np0005538960 nova_compute[187252]: 2025-11-28 16:28:46.807 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:46.817 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:5c:b3 10.100.0.6'], port_security=['fa:16:3e:c0:5c:b3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ce57eab6-06a0-474d-a6ea-12656babb7ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1984d677-5f2f-49ac-a5f9-2343abc938a7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=1a93323f-5f62-458a-b2d5-7d1745ecf9aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:28:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:46.818 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa in datapath 6b8cb1a4-1232-45b7-a54f-e85635df6a5a unbound from our chassis#033[00m
Nov 28 11:28:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:46.820 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b8cb1a4-1232-45b7-a54f-e85635df6a5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:28:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:46.821 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3e384e14-39c5-4f5a-a0df-568c1454f808]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:46.821 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a namespace which is not needed anymore#033[00m
Nov 28 11:28:46 np0005538960 nova_compute[187252]: 2025-11-28 16:28:46.824 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:46 np0005538960 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000022.scope: Deactivated successfully.
Nov 28 11:28:46 np0005538960 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000022.scope: Consumed 14.679s CPU time.
Nov 28 11:28:46 np0005538960 systemd-machined[153518]: Machine qemu-11-instance-00000022 terminated.
Nov 28 11:28:47 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220026]: [NOTICE]   (220030) : haproxy version is 2.8.14-c23fe91
Nov 28 11:28:47 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220026]: [NOTICE]   (220030) : path to executable is /usr/sbin/haproxy
Nov 28 11:28:47 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220026]: [WARNING]  (220030) : Exiting Master process...
Nov 28 11:28:47 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220026]: [WARNING]  (220030) : Exiting Master process...
Nov 28 11:28:47 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220026]: [ALERT]    (220030) : Current worker (220032) exited with code 143 (Terminated)
Nov 28 11:28:47 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220026]: [WARNING]  (220030) : All workers exited. Exiting... (0)
Nov 28 11:28:47 np0005538960 systemd[1]: libpod-e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396.scope: Deactivated successfully.
Nov 28 11:28:47 np0005538960 podman[220242]: 2025-11-28 16:28:47.119504932 +0000 UTC m=+0.198419453 container died e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 11:28:47 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396-userdata-shm.mount: Deactivated successfully.
Nov 28 11:28:47 np0005538960 systemd[1]: var-lib-containers-storage-overlay-05109c14401fac5fc143dad3f68beb4346853578ad96ac42f48b56c00fe127ad-merged.mount: Deactivated successfully.
Nov 28 11:28:47 np0005538960 podman[220242]: 2025-11-28 16:28:47.179531667 +0000 UTC m=+0.258446168 container cleanup e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 11:28:47 np0005538960 systemd[1]: libpod-conmon-e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396.scope: Deactivated successfully.
Nov 28 11:28:47 np0005538960 podman[220288]: 2025-11-28 16:28:47.248941731 +0000 UTC m=+0.043449581 container remove e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 11:28:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:47.255 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d912229a-d906-4f10-8761-9dbf6e0e1567]: (4, ('Fri Nov 28 04:28:46 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a (e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396)\ne85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396\nFri Nov 28 04:28:47 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a (e85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396)\ne85690b192165135b6f8159dfc3d2f498b32a9e4da83d7ba7b02e1bd4e62d396\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:47.257 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[091d8cc2-c02b-4755-944d-7f1c1d2ca428]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:47.259 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b8cb1a4-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.261 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:47 np0005538960 kernel: tap6b8cb1a4-10: left promiscuous mode
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.281 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:47.284 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[786b095a-643d-4016-8bc9-db494723fa28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:47.297 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[65f5b1af-2b8f-4563-86fa-04423e454a74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:47 np0005538960 podman[220289]: 2025-11-28 16:28:47.299159916 +0000 UTC m=+0.076326003 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:28:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:47.299 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[df70f89c-f072-4e9b-8d06-7f2bdd935bfe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:47.314 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[58c8e973-1405-4633-bc7d-d3ee8d5f78bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439276, 'reachable_time': 17605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220331, 'error': None, 'target': 'ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:47 np0005538960 systemd[1]: run-netns-ovnmeta\x2d6b8cb1a4\x2d1232\x2d45b7\x2da54f\x2de85635df6a5a.mount: Deactivated successfully.
Nov 28 11:28:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:47.319 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:28:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:47.319 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[7015d847-64a6-4d47-910c-8b835837d395]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.440 187256 DEBUG nova.compute.manager [req-c154f347-75fb-4cba-8d98-f174c17ae28e req-22265c41-1b3a-4f17-92a2-b76fa9e80c4b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-vif-unplugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.440 187256 DEBUG oslo_concurrency.lockutils [req-c154f347-75fb-4cba-8d98-f174c17ae28e req-22265c41-1b3a-4f17-92a2-b76fa9e80c4b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.440 187256 DEBUG oslo_concurrency.lockutils [req-c154f347-75fb-4cba-8d98-f174c17ae28e req-22265c41-1b3a-4f17-92a2-b76fa9e80c4b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.441 187256 DEBUG oslo_concurrency.lockutils [req-c154f347-75fb-4cba-8d98-f174c17ae28e req-22265c41-1b3a-4f17-92a2-b76fa9e80c4b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.441 187256 DEBUG nova.compute.manager [req-c154f347-75fb-4cba-8d98-f174c17ae28e req-22265c41-1b3a-4f17-92a2-b76fa9e80c4b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] No waiting events found dispatching network-vif-unplugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.441 187256 WARNING nova.compute.manager [req-c154f347-75fb-4cba-8d98-f174c17ae28e req-22265c41-1b3a-4f17-92a2-b76fa9e80c4b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received unexpected event network-vif-unplugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa for instance with vm_state active and task_state resize_migrating.#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.634 187256 INFO nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Instance shutdown successfully after 3 seconds.#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.640 187256 INFO nova.virt.libvirt.driver [-] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Instance destroyed successfully.#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.642 187256 DEBUG nova.virt.libvirt.vif [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:28:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1797102267',display_name='tempest-TestNetworkAdvancedServerOps-server-1797102267',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1797102267',id=34,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJWXQNoq9i14fJ2oMEnvHKoj5z47pddC9rW89mHqqhCU86I3zCb06Mwj3vaxCF0TkOfIjQvY2u8ugQzxEW+sF51eYGiPUBoZyEa/18LzqhM/C6ulPxonirUoF5gi5rGeSw==',key_name='tempest-TestNetworkAdvancedServerOps-1947953276',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:28:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-cjqomwv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:28:41Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=c4c0b9e1-fd20-4bc8-b105-53c8be08942f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2139115122", "vif_mac": "fa:16:3e:c0:5c:b3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.642 187256 DEBUG nova.network.os_vif_util [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2139115122", "vif_mac": "fa:16:3e:c0:5c:b3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.643 187256 DEBUG nova.network.os_vif_util [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.643 187256 DEBUG os_vif [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.645 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.646 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a93323f-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.647 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.649 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.652 187256 INFO os_vif [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f')#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.657 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.723 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.724 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.785 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.787 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f_resize/disk /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.821 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "cp -r /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f_resize/disk /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.822 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f_resize/disk.config /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.861 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "cp -r /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f_resize/disk.config /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.config" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.863 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): cp -r /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f_resize/disk.info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:47 np0005538960 nova_compute[187252]: 2025-11-28 16:28:47.896 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "cp -r /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f_resize/disk.info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.info" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:48 np0005538960 nova_compute[187252]: 2025-11-28 16:28:48.153 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:48 np0005538960 nova_compute[187252]: 2025-11-28 16:28:48.321 187256 DEBUG nova.network.neutron [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa binding to destination host compute-1.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Nov 28 11:28:48 np0005538960 nova_compute[187252]: 2025-11-28 16:28:48.495 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:48 np0005538960 nova_compute[187252]: 2025-11-28 16:28:48.495 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:48 np0005538960 nova_compute[187252]: 2025-11-28 16:28:48.496 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:49 np0005538960 nova_compute[187252]: 2025-11-28 16:28:49.035 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:28:49 np0005538960 nova_compute[187252]: 2025-11-28 16:28:49.035 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquired lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:28:49 np0005538960 nova_compute[187252]: 2025-11-28 16:28:49.036 187256 DEBUG nova.network.neutron [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:28:49 np0005538960 nova_compute[187252]: 2025-11-28 16:28:49.596 187256 DEBUG nova.compute.manager [req-c3364fdb-7c88-4616-9ab7-432eea4c3e81 req-a76e0f1e-5267-4dac-942c-53e50664481f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:28:49 np0005538960 nova_compute[187252]: 2025-11-28 16:28:49.597 187256 DEBUG oslo_concurrency.lockutils [req-c3364fdb-7c88-4616-9ab7-432eea4c3e81 req-a76e0f1e-5267-4dac-942c-53e50664481f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:49 np0005538960 nova_compute[187252]: 2025-11-28 16:28:49.597 187256 DEBUG oslo_concurrency.lockutils [req-c3364fdb-7c88-4616-9ab7-432eea4c3e81 req-a76e0f1e-5267-4dac-942c-53e50664481f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:49 np0005538960 nova_compute[187252]: 2025-11-28 16:28:49.597 187256 DEBUG oslo_concurrency.lockutils [req-c3364fdb-7c88-4616-9ab7-432eea4c3e81 req-a76e0f1e-5267-4dac-942c-53e50664481f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:49 np0005538960 nova_compute[187252]: 2025-11-28 16:28:49.597 187256 DEBUG nova.compute.manager [req-c3364fdb-7c88-4616-9ab7-432eea4c3e81 req-a76e0f1e-5267-4dac-942c-53e50664481f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] No waiting events found dispatching network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:28:49 np0005538960 nova_compute[187252]: 2025-11-28 16:28:49.598 187256 WARNING nova.compute.manager [req-c3364fdb-7c88-4616-9ab7-432eea4c3e81 req-a76e0f1e-5267-4dac-942c-53e50664481f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received unexpected event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa for instance with vm_state active and task_state resize_migrated.#033[00m
Nov 28 11:28:52 np0005538960 podman[220342]: 2025-11-28 16:28:52.16862231 +0000 UTC m=+0.067730894 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, vcs-type=git, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41)
Nov 28 11:28:52 np0005538960 nova_compute[187252]: 2025-11-28 16:28:52.648 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:53 np0005538960 nova_compute[187252]: 2025-11-28 16:28:53.154 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.192 187256 DEBUG nova.network.neutron [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updating instance_info_cache with network_info: [{"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.223 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Releasing lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.368 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.370 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.371 187256 INFO nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Creating image(s)#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.372 187256 DEBUG nova.objects.instance [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'trusted_certs' on Instance uuid c4c0b9e1-fd20-4bc8-b105-53c8be08942f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.393 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.456 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.457 187256 DEBUG nova.virt.disk.api [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Checking if we can resize image /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.457 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.522 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.523 187256 DEBUG nova.virt.disk.api [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Cannot resize image /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.538 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.539 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Ensure instance console log exists: /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.539 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.540 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.540 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.543 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Start _get_guest_xml network_info=[{"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2139115122", "vif_mac": "fa:16:3e:c0:5c:b3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.549 187256 WARNING nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.558 187256 DEBUG nova.virt.libvirt.host [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.559 187256 DEBUG nova.virt.libvirt.host [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.563 187256 DEBUG nova.virt.libvirt.host [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.564 187256 DEBUG nova.virt.libvirt.host [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.567 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.567 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='60d3f730-7668-4a83-b596-bc00400d7294',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.568 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.569 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.569 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.570 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.570 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.570 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.571 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.571 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.571 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.572 187256 DEBUG nova.virt.hardware [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.572 187256 DEBUG nova.objects.instance [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'vcpu_model' on Instance uuid c4c0b9e1-fd20-4bc8-b105-53c8be08942f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.589 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.660 187256 DEBUG oslo_concurrency.processutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.config --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.661 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.662 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.663 187256 DEBUG oslo_concurrency.lockutils [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.664 187256 DEBUG nova.virt.libvirt.vif [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:28:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1797102267',display_name='tempest-TestNetworkAdvancedServerOps-server-1797102267',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1797102267',id=34,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJWXQNoq9i14fJ2oMEnvHKoj5z47pddC9rW89mHqqhCU86I3zCb06Mwj3vaxCF0TkOfIjQvY2u8ugQzxEW+sF51eYGiPUBoZyEa/18LzqhM/C6ulPxonirUoF5gi5rGeSw==',key_name='tempest-TestNetworkAdvancedServerOps-1947953276',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:28:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-cjqomwv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:28:48Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=c4c0b9e1-fd20-4bc8-b105-53c8be08942f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2139115122", "vif_mac": "fa:16:3e:c0:5c:b3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.664 187256 DEBUG nova.network.os_vif_util [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2139115122", "vif_mac": "fa:16:3e:c0:5c:b3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.665 187256 DEBUG nova.network.os_vif_util [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.667 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <uuid>c4c0b9e1-fd20-4bc8-b105-53c8be08942f</uuid>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <name>instance-00000022</name>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <memory>196608</memory>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1797102267</nova:name>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:28:54</nova:creationTime>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.micro">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:        <nova:memory>192</nova:memory>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:        <nova:user uuid="5d381eba17324dd5ad798648b82d0115">tempest-TestNetworkAdvancedServerOps-762685809-project-member</nova:user>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:        <nova:project uuid="7e408bace48b41a1ac0677d300b6d288">tempest-TestNetworkAdvancedServerOps-762685809</nova:project>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:        <nova:port uuid="1a93323f-5f62-458a-b2d5-7d1745ecf9aa">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <entry name="serial">c4c0b9e1-fd20-4bc8-b105-53c8be08942f</entry>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <entry name="uuid">c4c0b9e1-fd20-4bc8-b105-53c8be08942f</entry>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/disk.config"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:c0:5c:b3"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <target dev="tap1a93323f-5f"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f/console.log" append="off"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:28:54 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:28:54 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:28:54 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:28:54 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.669 187256 DEBUG nova.virt.libvirt.vif [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:28:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1797102267',display_name='tempest-TestNetworkAdvancedServerOps-server-1797102267',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1797102267',id=34,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJWXQNoq9i14fJ2oMEnvHKoj5z47pddC9rW89mHqqhCU86I3zCb06Mwj3vaxCF0TkOfIjQvY2u8ugQzxEW+sF51eYGiPUBoZyEa/18LzqhM/C6ulPxonirUoF5gi5rGeSw==',key_name='tempest-TestNetworkAdvancedServerOps-1947953276',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:28:15Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-cjqomwv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:28:48Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=c4c0b9e1-fd20-4bc8-b105-53c8be08942f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2139115122", "vif_mac": "fa:16:3e:c0:5c:b3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.670 187256 DEBUG nova.network.os_vif_util [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--2139115122", "vif_mac": "fa:16:3e:c0:5c:b3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.670 187256 DEBUG nova.network.os_vif_util [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.671 187256 DEBUG os_vif [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.671 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.672 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.672 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.675 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.676 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1a93323f-5f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.676 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1a93323f-5f, col_values=(('external_ids', {'iface-id': '1a93323f-5f62-458a-b2d5-7d1745ecf9aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c0:5c:b3', 'vm-uuid': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.678 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:54 np0005538960 NetworkManager[55548]: <info>  [1764347334.6793] manager: (tap1a93323f-5f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.680 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.684 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.684 187256 INFO os_vif [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f')#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.747 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.748 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.749 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No VIF found with MAC fa:16:3e:c0:5c:b3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.749 187256 INFO nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Using config drive#033[00m
Nov 28 11:28:54 np0005538960 kernel: tap1a93323f-5f: entered promiscuous mode
Nov 28 11:28:54 np0005538960 NetworkManager[55548]: <info>  [1764347334.8273] manager: (tap1a93323f-5f): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Nov 28 11:28:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:54Z|00162|binding|INFO|Claiming lport 1a93323f-5f62-458a-b2d5-7d1745ecf9aa for this chassis.
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.886 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:54Z|00163|binding|INFO|1a93323f-5f62-458a-b2d5-7d1745ecf9aa: Claiming fa:16:3e:c0:5c:b3 10.100.0.6
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.895 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:5c:b3 10.100.0.6'], port_security=['fa:16:3e:c0:5c:b3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ce57eab6-06a0-474d-a6ea-12656babb7ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.180'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1984d677-5f2f-49ac-a5f9-2343abc938a7, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=1a93323f-5f62-458a-b2d5-7d1745ecf9aa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.897 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa in datapath 6b8cb1a4-1232-45b7-a54f-e85635df6a5a bound to our chassis#033[00m
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.898 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6b8cb1a4-1232-45b7-a54f-e85635df6a5a#033[00m
Nov 28 11:28:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:54Z|00164|binding|INFO|Setting lport 1a93323f-5f62-458a-b2d5-7d1745ecf9aa ovn-installed in OVS
Nov 28 11:28:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:54Z|00165|binding|INFO|Setting lport 1a93323f-5f62-458a-b2d5-7d1745ecf9aa up in Southbound
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.906 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:54 np0005538960 nova_compute[187252]: 2025-11-28 16:28:54.909 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:54 np0005538960 systemd-udevd[220389]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.912 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f622de05-554b-4ab2-bf48-47cb0d111e45]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.913 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6b8cb1a4-11 in ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.915 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6b8cb1a4-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.915 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[32fef4dc-ca97-4218-8040-1826ba6c4f73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.916 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d04a4390-ddcd-4ce4-86ae-830ed0ceccf8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:54 np0005538960 NetworkManager[55548]: <info>  [1764347334.9256] device (tap1a93323f-5f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:28:54 np0005538960 NetworkManager[55548]: <info>  [1764347334.9267] device (tap1a93323f-5f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.929 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf946f9-db16-47e3-86b9-84abb9d4fcfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:54 np0005538960 systemd-machined[153518]: New machine qemu-12-instance-00000022.
Nov 28 11:28:54 np0005538960 systemd[1]: Started Virtual Machine qemu-12-instance-00000022.
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.948 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0df5f1-0b41-49c0-9f77-4aff587c9ae5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.987 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[28fddd17-ec4b-47cf-8c2c-2d1722cf1239]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:54.992 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5d9c45cb-4dde-4a23-be55-bc37fbf91329]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:54 np0005538960 NetworkManager[55548]: <info>  [1764347334.9941] manager: (tap6b8cb1a4-10): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.033 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[0e198641-8e00-49fc-8204-2b311741580f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.038 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[03d21a82-f12f-442e-beee-d76ac8e2936c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:55 np0005538960 NetworkManager[55548]: <info>  [1764347335.0692] device (tap6b8cb1a4-10): carrier: link connected
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.074 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c07c3a-10f9-4152-a520-80aedd171027]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.097 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6ce4a98f-156f-447a-9794-70b039962310]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b8cb1a4-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:56:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443464, 'reachable_time': 30705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220424, 'error': None, 'target': 'ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.117 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[56e1e377-025b-4f55-b9e0-7adbfcf02584]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe55:5618'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 443464, 'tstamp': 443464}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220425, 'error': None, 'target': 'ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.138 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[391a9334-2e3d-4dcb-af0b-faaafdda3d60]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6b8cb1a4-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:56:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443464, 'reachable_time': 30705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220428, 'error': None, 'target': 'ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.172 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d47163c0-f376-4596-9182-9cedd1772ae1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.222 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Removed pending event for c4c0b9e1-fd20-4bc8-b105-53c8be08942f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.223 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347335.22178, c4c0b9e1-fd20-4bc8-b105-53c8be08942f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.224 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.226 187256 DEBUG nova.compute.manager [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.232 187256 INFO nova.virt.libvirt.driver [-] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Instance running successfully.#033[00m
Nov 28 11:28:55 np0005538960 virtqemud[186797]: argument unsupported: QEMU guest agent is not configured
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.235 187256 DEBUG nova.virt.libvirt.guest [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.235 187256 DEBUG nova.virt.libvirt.driver [None req-38d480b2-4573-409c-b3bc-a99b99d84218 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.246 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[212022a3-62da-4270-bbb5-147cce301f32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.248 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b8cb1a4-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.248 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.248 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b8cb1a4-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:55 np0005538960 kernel: tap6b8cb1a4-10: entered promiscuous mode
Nov 28 11:28:55 np0005538960 NetworkManager[55548]: <info>  [1764347335.2512] manager: (tap6b8cb1a4-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.250 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.252 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.253 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6b8cb1a4-10, col_values=(('external_ids', {'iface-id': '2258088d-2e87-495c-97b9-2f7fdc663c51'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:28:55 np0005538960 ovn_controller[95460]: 2025-11-28T16:28:55Z|00166|binding|INFO|Releasing lport 2258088d-2e87-495c-97b9-2f7fdc663c51 from this chassis (sb_readonly=0)
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.255 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.256 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.257 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6b8cb1a4-1232-45b7-a54f-e85635df6a5a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6b8cb1a4-1232-45b7-a54f-e85635df6a5a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.258 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b8097cd7-8b72-4786-9024-dfe155f38935]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.259 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-6b8cb1a4-1232-45b7-a54f-e85635df6a5a
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/6b8cb1a4-1232-45b7-a54f-e85635df6a5a.pid.haproxy
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID 6b8cb1a4-1232-45b7-a54f-e85635df6a5a
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:28:55 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:28:55.259 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'env', 'PROCESS_TAG=haproxy-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6b8cb1a4-1232-45b7-a54f-e85635df6a5a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.262 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.266 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.296 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.297 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347335.2233233, c4c0b9e1-fd20-4bc8-b105-53c8be08942f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.297 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] VM Started (Lifecycle Event)#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.339 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.349 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.575 187256 DEBUG nova.compute.manager [req-b0ba8994-1348-4eb3-9ad8-362f54f5936f req-e99f4686-148b-49df-bb38-f812d4de2431 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.576 187256 DEBUG oslo_concurrency.lockutils [req-b0ba8994-1348-4eb3-9ad8-362f54f5936f req-e99f4686-148b-49df-bb38-f812d4de2431 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.576 187256 DEBUG oslo_concurrency.lockutils [req-b0ba8994-1348-4eb3-9ad8-362f54f5936f req-e99f4686-148b-49df-bb38-f812d4de2431 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.577 187256 DEBUG oslo_concurrency.lockutils [req-b0ba8994-1348-4eb3-9ad8-362f54f5936f req-e99f4686-148b-49df-bb38-f812d4de2431 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.577 187256 DEBUG nova.compute.manager [req-b0ba8994-1348-4eb3-9ad8-362f54f5936f req-e99f4686-148b-49df-bb38-f812d4de2431 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] No waiting events found dispatching network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:28:55 np0005538960 nova_compute[187252]: 2025-11-28 16:28:55.577 187256 WARNING nova.compute.manager [req-b0ba8994-1348-4eb3-9ad8-362f54f5936f req-e99f4686-148b-49df-bb38-f812d4de2431 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received unexpected event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa for instance with vm_state resized and task_state None.#033[00m
Nov 28 11:28:55 np0005538960 podman[220465]: 2025-11-28 16:28:55.665797394 +0000 UTC m=+0.072507061 container create 5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:28:55 np0005538960 podman[220465]: 2025-11-28 16:28:55.624517127 +0000 UTC m=+0.031226904 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:28:55 np0005538960 systemd[1]: Started libpod-conmon-5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd.scope.
Nov 28 11:28:55 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:28:55 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88742d77195d44a08c1d1cd6d0107803a101189e2e8a599fea3b5c04c05a884c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:28:55 np0005538960 podman[220465]: 2025-11-28 16:28:55.782976724 +0000 UTC m=+0.189686421 container init 5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 11:28:55 np0005538960 podman[220465]: 2025-11-28 16:28:55.789196635 +0000 UTC m=+0.195906312 container start 5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 11:28:55 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220481]: [NOTICE]   (220485) : New worker (220488) forked
Nov 28 11:28:55 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220481]: [NOTICE]   (220485) : Loading success.
Nov 28 11:28:57 np0005538960 nova_compute[187252]: 2025-11-28 16:28:57.702 187256 DEBUG nova.compute.manager [req-baab08c0-4f6e-4318-9575-b49f8262d5f8 req-f467afa8-67b2-40c1-a3d7-e583c80d27f7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:28:57 np0005538960 nova_compute[187252]: 2025-11-28 16:28:57.703 187256 DEBUG oslo_concurrency.lockutils [req-baab08c0-4f6e-4318-9575-b49f8262d5f8 req-f467afa8-67b2-40c1-a3d7-e583c80d27f7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:57 np0005538960 nova_compute[187252]: 2025-11-28 16:28:57.703 187256 DEBUG oslo_concurrency.lockutils [req-baab08c0-4f6e-4318-9575-b49f8262d5f8 req-f467afa8-67b2-40c1-a3d7-e583c80d27f7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:57 np0005538960 nova_compute[187252]: 2025-11-28 16:28:57.703 187256 DEBUG oslo_concurrency.lockutils [req-baab08c0-4f6e-4318-9575-b49f8262d5f8 req-f467afa8-67b2-40c1-a3d7-e583c80d27f7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:28:57 np0005538960 nova_compute[187252]: 2025-11-28 16:28:57.703 187256 DEBUG nova.compute.manager [req-baab08c0-4f6e-4318-9575-b49f8262d5f8 req-f467afa8-67b2-40c1-a3d7-e583c80d27f7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] No waiting events found dispatching network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:28:57 np0005538960 nova_compute[187252]: 2025-11-28 16:28:57.703 187256 WARNING nova.compute.manager [req-baab08c0-4f6e-4318-9575-b49f8262d5f8 req-f467afa8-67b2-40c1-a3d7-e583c80d27f7 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received unexpected event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa for instance with vm_state resized and task_state None.#033[00m
Nov 28 11:28:58 np0005538960 nova_compute[187252]: 2025-11-28 16:28:58.122 187256 DEBUG oslo_concurrency.lockutils [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:28:58 np0005538960 nova_compute[187252]: 2025-11-28 16:28:58.123 187256 DEBUG oslo_concurrency.lockutils [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:28:58 np0005538960 nova_compute[187252]: 2025-11-28 16:28:58.123 187256 DEBUG nova.compute.manager [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Going to confirm migration 6 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Nov 28 11:28:58 np0005538960 nova_compute[187252]: 2025-11-28 16:28:58.159 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:28:58 np0005538960 nova_compute[187252]: 2025-11-28 16:28:58.440 187256 DEBUG oslo_concurrency.lockutils [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:28:58 np0005538960 nova_compute[187252]: 2025-11-28 16:28:58.441 187256 DEBUG oslo_concurrency.lockutils [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquired lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:28:58 np0005538960 nova_compute[187252]: 2025-11-28 16:28:58.441 187256 DEBUG nova.network.neutron [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:28:58 np0005538960 nova_compute[187252]: 2025-11-28 16:28:58.442 187256 DEBUG nova.objects.instance [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'info_cache' on Instance uuid c4c0b9e1-fd20-4bc8-b105-53c8be08942f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:28:59 np0005538960 nova_compute[187252]: 2025-11-28 16:28:59.679 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:00 np0005538960 podman[220497]: 2025-11-28 16:29:00.169437919 +0000 UTC m=+0.073959446 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 11:29:01 np0005538960 nova_compute[187252]: 2025-11-28 16:29:01.439 187256 DEBUG nova.network.neutron [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updating instance_info_cache with network_info: [{"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:29:01 np0005538960 nova_compute[187252]: 2025-11-28 16:29:01.461 187256 DEBUG oslo_concurrency.lockutils [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Releasing lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:29:01 np0005538960 nova_compute[187252]: 2025-11-28 16:29:01.462 187256 DEBUG nova.objects.instance [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'migration_context' on Instance uuid c4c0b9e1-fd20-4bc8-b105-53c8be08942f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:29:01 np0005538960 nova_compute[187252]: 2025-11-28 16:29:01.476 187256 DEBUG oslo_concurrency.lockutils [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:01 np0005538960 nova_compute[187252]: 2025-11-28 16:29:01.477 187256 DEBUG oslo_concurrency.lockutils [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:01 np0005538960 nova_compute[187252]: 2025-11-28 16:29:01.577 187256 DEBUG nova.compute.provider_tree [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:29:01 np0005538960 nova_compute[187252]: 2025-11-28 16:29:01.593 187256 DEBUG nova.scheduler.client.report [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:29:01 np0005538960 nova_compute[187252]: 2025-11-28 16:29:01.648 187256 DEBUG oslo_concurrency.lockutils [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:01 np0005538960 nova_compute[187252]: 2025-11-28 16:29:01.773 187256 INFO nova.scheduler.client.report [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Deleted allocation for migration afda9bf2-339b-446f-9d51-45c46989758c#033[00m
Nov 28 11:29:01 np0005538960 nova_compute[187252]: 2025-11-28 16:29:01.835 187256 DEBUG oslo_concurrency.lockutils [None req-c4a55ee0-5e09-4ed2-bc4b-981b667d3e80 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.712s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:03 np0005538960 nova_compute[187252]: 2025-11-28 16:29:03.161 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:04 np0005538960 podman[220518]: 2025-11-28 16:29:04.17630094 +0000 UTC m=+0.066620417 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:29:04 np0005538960 nova_compute[187252]: 2025-11-28 16:29:04.680 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:06.349 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:06.350 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:06.351 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.462 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "62d23852-bb8c-4240-a037-86cfa1b3a07c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.463 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.476 187256 DEBUG nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.564 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.565 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.572 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.572 187256 INFO nova.compute.claims [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.710 187256 DEBUG nova.compute.provider_tree [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.724 187256 DEBUG nova.scheduler.client.report [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.750 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.751 187256 DEBUG nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.793 187256 DEBUG nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.793 187256 DEBUG nova.network.neutron [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.806 187256 INFO nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.819 187256 DEBUG nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.985 187256 DEBUG nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.986 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.987 187256 INFO nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Creating image(s)#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.987 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "/var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.987 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "/var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:07 np0005538960 nova_compute[187252]: 2025-11-28 16:29:07.988 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "/var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.002 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.066 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.067 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.068 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.082 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.147 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.149 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.171 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.185 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.186 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.186 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.251 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.252 187256 DEBUG nova.virt.disk.api [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Checking if we can resize image /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.252 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.315 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.316 187256 DEBUG nova.virt.disk.api [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Cannot resize image /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.317 187256 DEBUG nova.objects.instance [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'migration_context' on Instance uuid 62d23852-bb8c-4240-a037-86cfa1b3a07c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.566 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.567 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Ensure instance console log exists: /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.567 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.568 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:08 np0005538960 nova_compute[187252]: 2025-11-28 16:29:08.568 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:08 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:08Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c0:5c:b3 10.100.0.6
Nov 28 11:29:09 np0005538960 nova_compute[187252]: 2025-11-28 16:29:09.683 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:10 np0005538960 podman[220574]: 2025-11-28 16:29:10.209821779 +0000 UTC m=+0.115850398 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_id=ovn_controller)
Nov 28 11:29:10 np0005538960 nova_compute[187252]: 2025-11-28 16:29:10.933 187256 DEBUG nova.policy [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a4105532118847f583e4bf7594336693', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:29:12 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:12.524 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:29:12 np0005538960 nova_compute[187252]: 2025-11-28 16:29:12.524 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:12 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:12.526 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:29:13 np0005538960 nova_compute[187252]: 2025-11-28 16:29:13.166 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:14 np0005538960 podman[220601]: 2025-11-28 16:29:14.157108367 +0000 UTC m=+0.060026926 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 11:29:14 np0005538960 podman[220602]: 2025-11-28 16:29:14.187355634 +0000 UTC m=+0.086473110 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125)
Nov 28 11:29:14 np0005538960 nova_compute[187252]: 2025-11-28 16:29:14.686 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:17 np0005538960 nova_compute[187252]: 2025-11-28 16:29:17.055 187256 INFO nova.compute.manager [None req-c128f826-4716-4338-9156-2776d4fb5092 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Get console output#033[00m
Nov 28 11:29:17 np0005538960 nova_compute[187252]: 2025-11-28 16:29:17.061 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:29:17 np0005538960 nova_compute[187252]: 2025-11-28 16:29:17.135 187256 DEBUG nova.network.neutron [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Successfully updated port: 17fbb695-95a2-4ae4-86e1-5ef421f19484 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:29:17 np0005538960 nova_compute[187252]: 2025-11-28 16:29:17.151 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "refresh_cache-62d23852-bb8c-4240-a037-86cfa1b3a07c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:29:17 np0005538960 nova_compute[187252]: 2025-11-28 16:29:17.152 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquired lock "refresh_cache-62d23852-bb8c-4240-a037-86cfa1b3a07c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:29:17 np0005538960 nova_compute[187252]: 2025-11-28 16:29:17.152 187256 DEBUG nova.network.neutron [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:29:17 np0005538960 nova_compute[187252]: 2025-11-28 16:29:17.245 187256 DEBUG nova.compute.manager [req-56c795fc-f86f-400e-bce8-7f49c5211d39 req-b9f7f546-4806-4c17-ac12-dfe0df5d88c6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Received event network-changed-17fbb695-95a2-4ae4-86e1-5ef421f19484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:29:17 np0005538960 nova_compute[187252]: 2025-11-28 16:29:17.246 187256 DEBUG nova.compute.manager [req-56c795fc-f86f-400e-bce8-7f49c5211d39 req-b9f7f546-4806-4c17-ac12-dfe0df5d88c6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Refreshing instance network info cache due to event network-changed-17fbb695-95a2-4ae4-86e1-5ef421f19484. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:29:17 np0005538960 nova_compute[187252]: 2025-11-28 16:29:17.246 187256 DEBUG oslo_concurrency.lockutils [req-56c795fc-f86f-400e-bce8-7f49c5211d39 req-b9f7f546-4806-4c17-ac12-dfe0df5d88c6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-62d23852-bb8c-4240-a037-86cfa1b3a07c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:29:17 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:17.529 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:18 np0005538960 podman[220639]: 2025-11-28 16:29:18.142035054 +0000 UTC m=+0.051478927 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:29:18 np0005538960 nova_compute[187252]: 2025-11-28 16:29:18.168 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:19 np0005538960 nova_compute[187252]: 2025-11-28 16:29:19.687 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:19 np0005538960 nova_compute[187252]: 2025-11-28 16:29:19.998 187256 DEBUG nova.network.neutron [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:29:20 np0005538960 nova_compute[187252]: 2025-11-28 16:29:20.937 187256 DEBUG oslo_concurrency.lockutils [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:20 np0005538960 nova_compute[187252]: 2025-11-28 16:29:20.938 187256 DEBUG oslo_concurrency.lockutils [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:20 np0005538960 nova_compute[187252]: 2025-11-28 16:29:20.938 187256 DEBUG oslo_concurrency.lockutils [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:20 np0005538960 nova_compute[187252]: 2025-11-28 16:29:20.938 187256 DEBUG oslo_concurrency.lockutils [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:20 np0005538960 nova_compute[187252]: 2025-11-28 16:29:20.939 187256 DEBUG oslo_concurrency.lockutils [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:20 np0005538960 nova_compute[187252]: 2025-11-28 16:29:20.940 187256 INFO nova.compute.manager [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Terminating instance#033[00m
Nov 28 11:29:20 np0005538960 nova_compute[187252]: 2025-11-28 16:29:20.941 187256 DEBUG nova.compute.manager [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:29:20 np0005538960 kernel: tap1a93323f-5f (unregistering): left promiscuous mode
Nov 28 11:29:20 np0005538960 NetworkManager[55548]: <info>  [1764347360.9736] device (tap1a93323f-5f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:29:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:20Z|00167|binding|INFO|Releasing lport 1a93323f-5f62-458a-b2d5-7d1745ecf9aa from this chassis (sb_readonly=0)
Nov 28 11:29:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:20Z|00168|binding|INFO|Setting lport 1a93323f-5f62-458a-b2d5-7d1745ecf9aa down in Southbound
Nov 28 11:29:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:20Z|00169|binding|INFO|Removing iface tap1a93323f-5f ovn-installed in OVS
Nov 28 11:29:20 np0005538960 nova_compute[187252]: 2025-11-28 16:29:20.978 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:20 np0005538960 nova_compute[187252]: 2025-11-28 16:29:20.981 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:20.991 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c0:5c:b3 10.100.0.6'], port_security=['fa:16:3e:c0:5c:b3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c4c0b9e1-fd20-4bc8-b105-53c8be08942f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ce57eab6-06a0-474d-a6ea-12656babb7ac', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1984d677-5f2f-49ac-a5f9-2343abc938a7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=1a93323f-5f62-458a-b2d5-7d1745ecf9aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:29:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:20.993 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa in datapath 6b8cb1a4-1232-45b7-a54f-e85635df6a5a unbound from our chassis#033[00m
Nov 28 11:29:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:20.995 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6b8cb1a4-1232-45b7-a54f-e85635df6a5a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:29:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:20.996 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[af1c4db3-f610-4a15-a813-31e3d2a42ccd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:20.996 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a namespace which is not needed anymore#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.003 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:21 np0005538960 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000022.scope: Deactivated successfully.
Nov 28 11:29:21 np0005538960 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000022.scope: Consumed 13.877s CPU time.
Nov 28 11:29:21 np0005538960 systemd-machined[153518]: Machine qemu-12-instance-00000022 terminated.
Nov 28 11:29:21 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220481]: [NOTICE]   (220485) : haproxy version is 2.8.14-c23fe91
Nov 28 11:29:21 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220481]: [NOTICE]   (220485) : path to executable is /usr/sbin/haproxy
Nov 28 11:29:21 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220481]: [WARNING]  (220485) : Exiting Master process...
Nov 28 11:29:21 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220481]: [ALERT]    (220485) : Current worker (220488) exited with code 143 (Terminated)
Nov 28 11:29:21 np0005538960 neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a[220481]: [WARNING]  (220485) : All workers exited. Exiting... (0)
Nov 28 11:29:21 np0005538960 systemd[1]: libpod-5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd.scope: Deactivated successfully.
Nov 28 11:29:21 np0005538960 podman[220688]: 2025-11-28 16:29:21.166230535 +0000 UTC m=+0.058068328 container died 5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.170 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.175 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:21 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd-userdata-shm.mount: Deactivated successfully.
Nov 28 11:29:21 np0005538960 systemd[1]: var-lib-containers-storage-overlay-88742d77195d44a08c1d1cd6d0107803a101189e2e8a599fea3b5c04c05a884c-merged.mount: Deactivated successfully.
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.217 187256 INFO nova.virt.libvirt.driver [-] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Instance destroyed successfully.#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.219 187256 DEBUG nova.objects.instance [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'resources' on Instance uuid c4c0b9e1-fd20-4bc8-b105-53c8be08942f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:29:21 np0005538960 podman[220688]: 2025-11-28 16:29:21.221993486 +0000 UTC m=+0.113831279 container cleanup 5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, org.label-schema.build-date=20251125, tcib_managed=true, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:29:21 np0005538960 systemd[1]: libpod-conmon-5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd.scope: Deactivated successfully.
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.240 187256 DEBUG nova.virt.libvirt.vif [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:28:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1797102267',display_name='tempest-TestNetworkAdvancedServerOps-server-1797102267',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1797102267',id=34,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJWXQNoq9i14fJ2oMEnvHKoj5z47pddC9rW89mHqqhCU86I3zCb06Mwj3vaxCF0TkOfIjQvY2u8ugQzxEW+sF51eYGiPUBoZyEa/18LzqhM/C6ulPxonirUoF5gi5rGeSw==',key_name='tempest-TestNetworkAdvancedServerOps-1947953276',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:28:55Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-cjqomwv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:29:01Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=c4c0b9e1-fd20-4bc8-b105-53c8be08942f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.241 187256 DEBUG nova.network.os_vif_util [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "address": "fa:16:3e:c0:5c:b3", "network": {"id": "6b8cb1a4-1232-45b7-a54f-e85635df6a5a", "bridge": "br-int", "label": "tempest-network-smoke--2139115122", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1a93323f-5f", "ovs_interfaceid": "1a93323f-5f62-458a-b2d5-7d1745ecf9aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.242 187256 DEBUG nova.network.os_vif_util [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.242 187256 DEBUG os_vif [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.244 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.244 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1a93323f-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.246 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.248 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.250 187256 INFO os_vif [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c0:5c:b3,bridge_name='br-int',has_traffic_filtering=True,id=1a93323f-5f62-458a-b2d5-7d1745ecf9aa,network=Network(6b8cb1a4-1232-45b7-a54f-e85635df6a5a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1a93323f-5f')#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.251 187256 INFO nova.virt.libvirt.driver [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Deleting instance files /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f_del#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.252 187256 INFO nova.virt.libvirt.driver [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Deletion of /var/lib/nova/instances/c4c0b9e1-fd20-4bc8-b105-53c8be08942f_del complete#033[00m
Nov 28 11:29:21 np0005538960 podman[220731]: 2025-11-28 16:29:21.292051126 +0000 UTC m=+0.045929032 container remove 5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 11:29:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:21.298 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[75381d18-e2fb-4b83-af66-0bf93e595f47]: (4, ('Fri Nov 28 04:29:21 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a (5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd)\n5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd\nFri Nov 28 04:29:21 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a (5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd)\n5101c15ba0b2c880869508c6f5db6255fbb848cd2cd95b0b9028e14706c906dd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:21.300 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[0a143052-f840-4146-9035-a7300191d6d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:21.301 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b8cb1a4-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.303 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:21 np0005538960 kernel: tap6b8cb1a4-10: left promiscuous mode
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.306 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:21.309 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[14d392f5-f992-43ac-b63b-240171a5c690]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.318 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:21.331 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[301eb722-1d84-41c0-9729-579d8840d4bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.334 187256 INFO nova.compute.manager [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.335 187256 DEBUG oslo.service.loopingcall [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:29:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:21.335 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ed6882bb-b464-4365-a27e-8701cdd55ca2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.335 187256 DEBUG nova.compute.manager [-] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:29:21 np0005538960 nova_compute[187252]: 2025-11-28 16:29:21.335 187256 DEBUG nova.network.neutron [-] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:29:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:21.354 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[85d42128-c7be-492d-a91e-64595a9d06f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 443456, 'reachable_time': 26572, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220746, 'error': None, 'target': 'ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:21.357 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6b8cb1a4-1232-45b7-a54f-e85635df6a5a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:29:21 np0005538960 systemd[1]: run-netns-ovnmeta\x2d6b8cb1a4\x2d1232\x2d45b7\x2da54f\x2de85635df6a5a.mount: Deactivated successfully.
Nov 28 11:29:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:21.357 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[a56e8e24-d3bf-42ac-9f81-eeb791a03c80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.252 187256 DEBUG nova.network.neutron [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Updating instance_info_cache with network_info: [{"id": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "address": "fa:16:3e:55:96:1d", "network": {"id": "190e04f9-d028-441a-93bf-e8d4ff728b31", "bridge": "br-int", "label": "tempest-network-smoke--810131062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17fbb695-95", "ovs_interfaceid": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.278 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Releasing lock "refresh_cache-62d23852-bb8c-4240-a037-86cfa1b3a07c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.279 187256 DEBUG nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Instance network_info: |[{"id": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "address": "fa:16:3e:55:96:1d", "network": {"id": "190e04f9-d028-441a-93bf-e8d4ff728b31", "bridge": "br-int", "label": "tempest-network-smoke--810131062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17fbb695-95", "ovs_interfaceid": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.279 187256 DEBUG oslo_concurrency.lockutils [req-56c795fc-f86f-400e-bce8-7f49c5211d39 req-b9f7f546-4806-4c17-ac12-dfe0df5d88c6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-62d23852-bb8c-4240-a037-86cfa1b3a07c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.279 187256 DEBUG nova.network.neutron [req-56c795fc-f86f-400e-bce8-7f49c5211d39 req-b9f7f546-4806-4c17-ac12-dfe0df5d88c6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Refreshing network info cache for port 17fbb695-95a2-4ae4-86e1-5ef421f19484 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.282 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Start _get_guest_xml network_info=[{"id": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "address": "fa:16:3e:55:96:1d", "network": {"id": "190e04f9-d028-441a-93bf-e8d4ff728b31", "bridge": "br-int", "label": "tempest-network-smoke--810131062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17fbb695-95", "ovs_interfaceid": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.289 187256 WARNING nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.298 187256 DEBUG nova.virt.libvirt.host [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.299 187256 DEBUG nova.virt.libvirt.host [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.303 187256 DEBUG nova.virt.libvirt.host [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.304 187256 DEBUG nova.virt.libvirt.host [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.305 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.305 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.306 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.306 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.306 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.307 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.307 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.307 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.308 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.308 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.308 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.308 187256 DEBUG nova.virt.hardware [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.313 187256 DEBUG nova.virt.libvirt.vif [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:29:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1332009570',display_name='tempest-TestNetworkBasicOps-server-1332009570',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1332009570',id=38,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJQ0RTFDs7EAnXlW6axzbeQzyPz2JX8FpFU+5BzPCu2vdnl062oP4eGavf8/eCy1TDBc3eW1tX7zh69rkuU/5qVDbHgMYHK+wBETL/mOYZZgjsNWcxYMaAfdQcdMxRjmtA==',key_name='tempest-TestNetworkBasicOps-1440030787',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-03twp9uz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:29:07Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=62d23852-bb8c-4240-a037-86cfa1b3a07c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "address": "fa:16:3e:55:96:1d", "network": {"id": "190e04f9-d028-441a-93bf-e8d4ff728b31", "bridge": "br-int", "label": "tempest-network-smoke--810131062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17fbb695-95", "ovs_interfaceid": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.313 187256 DEBUG nova.network.os_vif_util [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "address": "fa:16:3e:55:96:1d", "network": {"id": "190e04f9-d028-441a-93bf-e8d4ff728b31", "bridge": "br-int", "label": "tempest-network-smoke--810131062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17fbb695-95", "ovs_interfaceid": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.314 187256 DEBUG nova.network.os_vif_util [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:96:1d,bridge_name='br-int',has_traffic_filtering=True,id=17fbb695-95a2-4ae4-86e1-5ef421f19484,network=Network(190e04f9-d028-441a-93bf-e8d4ff728b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap17fbb695-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.315 187256 DEBUG nova.objects.instance [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'pci_devices' on Instance uuid 62d23852-bb8c-4240-a037-86cfa1b3a07c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.333 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <uuid>62d23852-bb8c-4240-a037-86cfa1b3a07c</uuid>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <name>instance-00000026</name>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkBasicOps-server-1332009570</nova:name>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:29:22</nova:creationTime>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:        <nova:user uuid="a4105532118847f583e4bf7594336693">tempest-TestNetworkBasicOps-561116586-project-member</nova:user>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:        <nova:project uuid="d9d93d32b42b46fbb1392048dd8941bb">tempest-TestNetworkBasicOps-561116586</nova:project>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:        <nova:port uuid="17fbb695-95a2-4ae4-86e1-5ef421f19484">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <entry name="serial">62d23852-bb8c-4240-a037-86cfa1b3a07c</entry>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <entry name="uuid">62d23852-bb8c-4240-a037-86cfa1b3a07c</entry>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk.config"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:55:96:1d"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <target dev="tap17fbb695-95"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/console.log" append="off"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:29:22 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:29:22 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:29:22 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:29:22 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.335 187256 DEBUG nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Preparing to wait for external event network-vif-plugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.335 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.336 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.336 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.337 187256 DEBUG nova.virt.libvirt.vif [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:29:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1332009570',display_name='tempest-TestNetworkBasicOps-server-1332009570',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1332009570',id=38,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJQ0RTFDs7EAnXlW6axzbeQzyPz2JX8FpFU+5BzPCu2vdnl062oP4eGavf8/eCy1TDBc3eW1tX7zh69rkuU/5qVDbHgMYHK+wBETL/mOYZZgjsNWcxYMaAfdQcdMxRjmtA==',key_name='tempest-TestNetworkBasicOps-1440030787',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-03twp9uz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:29:07Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=62d23852-bb8c-4240-a037-86cfa1b3a07c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "address": "fa:16:3e:55:96:1d", "network": {"id": "190e04f9-d028-441a-93bf-e8d4ff728b31", "bridge": "br-int", "label": "tempest-network-smoke--810131062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17fbb695-95", "ovs_interfaceid": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.338 187256 DEBUG nova.network.os_vif_util [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "address": "fa:16:3e:55:96:1d", "network": {"id": "190e04f9-d028-441a-93bf-e8d4ff728b31", "bridge": "br-int", "label": "tempest-network-smoke--810131062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17fbb695-95", "ovs_interfaceid": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.338 187256 DEBUG nova.network.os_vif_util [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:96:1d,bridge_name='br-int',has_traffic_filtering=True,id=17fbb695-95a2-4ae4-86e1-5ef421f19484,network=Network(190e04f9-d028-441a-93bf-e8d4ff728b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap17fbb695-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.339 187256 DEBUG os_vif [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:96:1d,bridge_name='br-int',has_traffic_filtering=True,id=17fbb695-95a2-4ae4-86e1-5ef421f19484,network=Network(190e04f9-d028-441a-93bf-e8d4ff728b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap17fbb695-95') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.340 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.340 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.341 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.344 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.344 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap17fbb695-95, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.345 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap17fbb695-95, col_values=(('external_ids', {'iface-id': '17fbb695-95a2-4ae4-86e1-5ef421f19484', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:55:96:1d', 'vm-uuid': '62d23852-bb8c-4240-a037-86cfa1b3a07c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.346 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:22 np0005538960 NetworkManager[55548]: <info>  [1764347362.3478] manager: (tap17fbb695-95): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.349 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.353 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.354 187256 INFO os_vif [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:96:1d,bridge_name='br-int',has_traffic_filtering=True,id=17fbb695-95a2-4ae4-86e1-5ef421f19484,network=Network(190e04f9-d028-441a-93bf-e8d4ff728b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap17fbb695-95')#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.438 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.438 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.439 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] No VIF found with MAC fa:16:3e:55:96:1d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:29:22 np0005538960 nova_compute[187252]: 2025-11-28 16:29:22.439 187256 INFO nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Using config drive#033[00m
Nov 28 11:29:22 np0005538960 podman[220749]: 2025-11-28 16:29:22.451463959 +0000 UTC m=+0.063570533 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, architecture=x86_64, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 11:29:23 np0005538960 nova_compute[187252]: 2025-11-28 16:29:23.209 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:23 np0005538960 nova_compute[187252]: 2025-11-28 16:29:23.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:29:23 np0005538960 nova_compute[187252]: 2025-11-28 16:29:23.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.344 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.344 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.345 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.345 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.356 187256 DEBUG nova.compute.manager [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-changed-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.356 187256 DEBUG nova.compute.manager [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Refreshing instance network info cache due to event network-changed-1a93323f-5f62-458a-b2d5-7d1745ecf9aa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.356 187256 DEBUG oslo_concurrency.lockutils [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.356 187256 DEBUG oslo_concurrency.lockutils [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.357 187256 DEBUG nova.network.neutron [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Refreshing network info cache for port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.496 187256 DEBUG nova.network.neutron [-] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.524 187256 INFO nova.compute.manager [-] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Took 3.19 seconds to deallocate network for instance.#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.591 187256 DEBUG oslo_concurrency.lockutils [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.592 187256 DEBUG oslo_concurrency.lockutils [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.596 187256 DEBUG oslo_concurrency.lockutils [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.689 187256 INFO nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Creating config drive at /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk.config#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.694 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb_ii397f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.716 187256 INFO nova.scheduler.client.report [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Deleted allocations for instance c4c0b9e1-fd20-4bc8-b105-53c8be08942f#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.719 187256 INFO nova.network.neutron [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Port 1a93323f-5f62-458a-b2d5-7d1745ecf9aa from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.720 187256 DEBUG nova.network.neutron [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.740 187256 DEBUG oslo_concurrency.lockutils [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-c4c0b9e1-fd20-4bc8-b105-53c8be08942f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.741 187256 DEBUG nova.compute.manager [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-vif-unplugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.741 187256 DEBUG oslo_concurrency.lockutils [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.742 187256 DEBUG oslo_concurrency.lockutils [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.743 187256 DEBUG oslo_concurrency.lockutils [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.743 187256 DEBUG nova.compute.manager [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] No waiting events found dispatching network-vif-unplugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.743 187256 DEBUG nova.compute.manager [req-89fe050d-20c5-4dac-ad70-9ed3be7883a5 req-2cd78f89-b7b0-403c-9845-a93796eab185 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-vif-unplugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.822 187256 DEBUG oslo_concurrency.lockutils [None req-c8319578-dc45-4558-b779-c686af22ec93 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.824 187256 DEBUG oslo_concurrency.processutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpb_ii397f" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:24 np0005538960 kernel: tap17fbb695-95: entered promiscuous mode
Nov 28 11:29:24 np0005538960 NetworkManager[55548]: <info>  [1764347364.9009] manager: (tap17fbb695-95): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Nov 28 11:29:24 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:24Z|00170|binding|INFO|Claiming lport 17fbb695-95a2-4ae4-86e1-5ef421f19484 for this chassis.
Nov 28 11:29:24 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:24Z|00171|binding|INFO|17fbb695-95a2-4ae4-86e1-5ef421f19484: Claiming fa:16:3e:55:96:1d 10.100.0.5
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.902 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.911 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:96:1d 10.100.0.5'], port_security=['fa:16:3e:55:96:1d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1231382668', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '62d23852-bb8c-4240-a037-86cfa1b3a07c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-190e04f9-d028-441a-93bf-e8d4ff728b31', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1231382668', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '7', 'neutron:security_group_ids': '4551c4f3-bc67-4846-a687-f0bfbc9398f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c938c518-d34f-4f96-8307-117b8a9fa334, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=17fbb695-95a2-4ae4-86e1-5ef421f19484) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.912 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 17fbb695-95a2-4ae4-86e1-5ef421f19484 in datapath 190e04f9-d028-441a-93bf-e8d4ff728b31 bound to our chassis#033[00m
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.914 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 190e04f9-d028-441a-93bf-e8d4ff728b31#033[00m
Nov 28 11:29:24 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:24Z|00172|binding|INFO|Setting lport 17fbb695-95a2-4ae4-86e1-5ef421f19484 ovn-installed in OVS
Nov 28 11:29:24 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:24Z|00173|binding|INFO|Setting lport 17fbb695-95a2-4ae4-86e1-5ef421f19484 up in Southbound
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.917 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:24 np0005538960 nova_compute[187252]: 2025-11-28 16:29:24.923 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.927 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b5e6a747-320d-4423-acd1-922a620748f8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.928 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap190e04f9-d1 in ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.931 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap190e04f9-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.931 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[11ee6400-ce0a-49ec-aa0b-f93fa02d65b2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.932 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[de4d2f50-7ac1-455c-bea2-8c5dac64b569]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:24 np0005538960 systemd-udevd[220792]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.945 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[7f90f6eb-3899-4b7e-ad25-d6b683dd58ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:24 np0005538960 systemd-machined[153518]: New machine qemu-13-instance-00000026.
Nov 28 11:29:24 np0005538960 NetworkManager[55548]: <info>  [1764347364.9550] device (tap17fbb695-95): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:29:24 np0005538960 NetworkManager[55548]: <info>  [1764347364.9560] device (tap17fbb695-95): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:29:24 np0005538960 systemd[1]: Started Virtual Machine qemu-13-instance-00000026.
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.963 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b86acb04-cea2-4657-aa09-db220f9e472c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:24.998 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[8d2a6605-d62e-4a7b-b9c7-52c37681e75b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.006 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e6478088-1ab3-4fd6-8420-7ac9a1badb92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 NetworkManager[55548]: <info>  [1764347365.0077] manager: (tap190e04f9-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/93)
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.044 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[fe1373e1-b7d2-4b33-b400-c842c09b7fea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.048 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[c6e2354c-2682-424f-b835-a8d3daa3df70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 NetworkManager[55548]: <info>  [1764347365.0854] device (tap190e04f9-d0): carrier: link connected
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.093 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[2976572e-83a4-419c-9ed7-ce514615e99b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.111 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[30fcc810-85ac-49d2-a495-3837ad7c9c3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap190e04f9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:bd:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446466, 'reachable_time': 38361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220823, 'error': None, 'target': 'ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.130 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb91079-5ea8-46d4-bea4-3ef9f4711790]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:bdc2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446466, 'tstamp': 446466}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220824, 'error': None, 'target': 'ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.135 187256 DEBUG nova.compute.manager [req-b4bd6a2e-d5e7-4846-94d4-fc7c0982ce4b req-ef1f8c7d-c212-4fcd-a1dd-44e99804897f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-vif-deleted-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.150 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d4666b69-4e21-4caf-97b0-63261f58b904]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap190e04f9-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:bd:c2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 52], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446466, 'reachable_time': 38361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220825, 'error': None, 'target': 'ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.187 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea22441-b6d7-46db-86e2-18f0ebbc5b9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.265 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b880ee3d-be38-48a8-9e1c-9ebcb3031a06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.268 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap190e04f9-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.268 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.269 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap190e04f9-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.271 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:25 np0005538960 NetworkManager[55548]: <info>  [1764347365.2723] manager: (tap190e04f9-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Nov 28 11:29:25 np0005538960 kernel: tap190e04f9-d0: entered promiscuous mode
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.273 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.276 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap190e04f9-d0, col_values=(('external_ids', {'iface-id': '2cb95f44-afec-455d-89b0-45fb2a803d6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.277 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:25 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:25Z|00174|binding|INFO|Releasing lport 2cb95f44-afec-455d-89b0-45fb2a803d6d from this chassis (sb_readonly=0)
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.277 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.279 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/190e04f9-d028-441a-93bf-e8d4ff728b31.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/190e04f9-d028-441a-93bf-e8d4ff728b31.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.280 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[18456a52-733b-4536-8e5f-2876f9fe40d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.282 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-190e04f9-d028-441a-93bf-e8d4ff728b31
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/190e04f9-d028-441a-93bf-e8d4ff728b31.pid.haproxy
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID 190e04f9-d028-441a-93bf-e8d4ff728b31
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:29:25 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:25.283 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31', 'env', 'PROCESS_TAG=haproxy-190e04f9-d028-441a-93bf-e8d4ff728b31', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/190e04f9-d028-441a-93bf-e8d4ff728b31.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.288 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.345 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.346 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.346 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.346 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.377 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347365.3772392, 62d23852-bb8c-4240-a037-86cfa1b3a07c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.378 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] VM Started (Lifecycle Event)#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.411 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.417 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347365.3774524, 62d23852-bb8c-4240-a037-86cfa1b3a07c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.418 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.445 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.448 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.472 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.494 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
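The sync above compares DB power_state 0 (NOSTATE) with the hypervisor's reported power_state 3 (PAUSED), then skips the update because task_state is still `spawning`. A minimal sketch of that decision, using Nova's documented power-state values; the helper function itself is hypothetical, not the real `_sync_instance_power_state`:

```python
# Nova power-state constants (values match the ones in the log:
# DB power_state 0, VM power_state 3).
NOSTATE, RUNNING, PAUSED, SHUTDOWN = 0, 1, 3, 4

STATE_NAMES = {NOSTATE: "pending", RUNNING: "running",
               PAUSED: "paused", SHUTDOWN: "shutdown"}

def sync_power_state(db_power_state, vm_power_state, task_state):
    """Hypothetical sketch of the skip-or-update decision."""
    if task_state is not None:
        # Matches the log: "During sync_power_state the instance has
        # a pending task (spawning). Skip."
        return f"skip: pending task ({task_state})"
    if db_power_state != vm_power_state:
        return (f"update DB: {STATE_NAMES[db_power_state]} -> "
                f"{STATE_NAMES[vm_power_state]}")
    return "in sync"

print(sync_power_state(0, 3, "spawning"))  # skip: pending task (spawning)
```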
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.516 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.517 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.582 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
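The disk audit above shells out to qemu-img under oslo's prlimit wrapper (address space capped at 1 GiB, CPU at 30 s) and asks for JSON. A minimal sketch of consuming that output; the sample values below are illustrative, though the key names are qemu-img's standard JSON fields:

```python
import json

# Illustrative output of the command seen in the log:
#   prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C \
#     qemu-img info <disk> --force-share --output=json
sample = '''{
  "virtual-size": 1073741824,
  "filename": "/var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c/disk",
  "format": "qcow2",
  "actual-size": 2162688
}'''

info = json.loads(sample)
virtual_gb = info["virtual-size"] / 1024**3  # bytes -> GiB
print(info["format"], virtual_gb)  # qcow2 1.0
```

`--force-share` lets the audit read the image while the running guest holds the write lock, which is why the periodic task can inspect an in-use disk.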
Nov 28 11:29:25 np0005538960 podman[220873]: 2025-11-28 16:29:25.674203745 +0000 UTC m=+0.058055477 container create 6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:29:25 np0005538960 systemd[1]: Started libpod-conmon-6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a.scope.
Nov 28 11:29:25 np0005538960 podman[220873]: 2025-11-28 16:29:25.640872102 +0000 UTC m=+0.024723854 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:29:25 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.752 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.753 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5684MB free_disk=73.33718872070312GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.753 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.754 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:25 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/12033acb456d86518f29ac2f50aaa635103565f110873f8ccbfb43ce32174413/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:29:25 np0005538960 podman[220873]: 2025-11-28 16:29:25.77515853 +0000 UTC m=+0.159010462 container init 6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:29:25 np0005538960 podman[220873]: 2025-11-28 16:29:25.783038711 +0000 UTC m=+0.166890443 container start 6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:29:25 np0005538960 neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31[220889]: [NOTICE]   (220893) : New worker (220895) forked
Nov 28 11:29:25 np0005538960 neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31[220889]: [NOTICE]   (220893) : Loading success.
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.842 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance 62d23852-bb8c-4240-a037-86cfa1b3a07c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.842 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.842 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.863 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing inventories for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.893 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating ProviderTree inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.893 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.928 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing aggregate associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 11:29:25 np0005538960 nova_compute[187252]: 2025-11-28 16:29:25.963 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing trait associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, traits: COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 11:29:26 np0005538960 nova_compute[187252]: 2025-11-28 16:29:26.041 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:29:26 np0005538960 nova_compute[187252]: 2025-11-28 16:29:26.061 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
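The inventory payload above is what the scheduler places against: Placement's effective capacity per resource class is (total - reserved) * allocation_ratio. Applying that formula to the logged values (a sketch; the arithmetic, not Nova code):

```python
# Inventory as reported in the log for provider
# 65f0ce30-d9ca-4c16-b536-acd92f5f41ce.
inventory = {
    "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 79,   "reserved": 1,   "allocation_ratio": 0.9},
}

def capacity(inv):
    """Effective schedulable capacity: (total - reserved) * allocation_ratio."""
    return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
            for rc, v in inv.items()}

print(capacity(inventory))
# -> VCPU 32.0, MEMORY_MB 7167.0, DISK_GB ~70.2
```

So an 8-vCPU host can carry 32 vCPUs of instances at the 4.0 overcommit ratio, while disk is deliberately undercommitted (ratio 0.9).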
Nov 28 11:29:26 np0005538960 nova_compute[187252]: 2025-11-28 16:29:26.099 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:29:26 np0005538960 nova_compute[187252]: 2025-11-28 16:29:26.099 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.346s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:26 np0005538960 nova_compute[187252]: 2025-11-28 16:29:26.498 187256 DEBUG nova.compute.manager [req-56b2fc5d-1516-41bc-88d5-2fe9933c110a req-a1fd5793-2c9b-412b-a778-8b3ec2d57aed 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:29:26 np0005538960 nova_compute[187252]: 2025-11-28 16:29:26.499 187256 DEBUG oslo_concurrency.lockutils [req-56b2fc5d-1516-41bc-88d5-2fe9933c110a req-a1fd5793-2c9b-412b-a778-8b3ec2d57aed 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:26 np0005538960 nova_compute[187252]: 2025-11-28 16:29:26.499 187256 DEBUG oslo_concurrency.lockutils [req-56b2fc5d-1516-41bc-88d5-2fe9933c110a req-a1fd5793-2c9b-412b-a778-8b3ec2d57aed 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:26 np0005538960 nova_compute[187252]: 2025-11-28 16:29:26.500 187256 DEBUG oslo_concurrency.lockutils [req-56b2fc5d-1516-41bc-88d5-2fe9933c110a req-a1fd5793-2c9b-412b-a778-8b3ec2d57aed 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "c4c0b9e1-fd20-4bc8-b105-53c8be08942f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:26 np0005538960 nova_compute[187252]: 2025-11-28 16:29:26.500 187256 DEBUG nova.compute.manager [req-56b2fc5d-1516-41bc-88d5-2fe9933c110a req-a1fd5793-2c9b-412b-a778-8b3ec2d57aed 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] No waiting events found dispatching network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:29:26 np0005538960 nova_compute[187252]: 2025-11-28 16:29:26.500 187256 WARNING nova.compute.manager [req-56b2fc5d-1516-41bc-88d5-2fe9933c110a req-a1fd5793-2c9b-412b-a778-8b3ec2d57aed 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Received unexpected event network-vif-plugged-1a93323f-5f62-458a-b2d5-7d1745ecf9aa for instance with vm_state deleted and task_state None.#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.095 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.095 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.253 187256 DEBUG nova.network.neutron [req-56c795fc-f86f-400e-bce8-7f49c5211d39 req-b9f7f546-4806-4c17-ac12-dfe0df5d88c6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Updated VIF entry in instance network info cache for port 17fbb695-95a2-4ae4-86e1-5ef421f19484. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.254 187256 DEBUG nova.network.neutron [req-56c795fc-f86f-400e-bce8-7f49c5211d39 req-b9f7f546-4806-4c17-ac12-dfe0df5d88c6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Updating instance_info_cache with network_info: [{"id": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "address": "fa:16:3e:55:96:1d", "network": {"id": "190e04f9-d028-441a-93bf-e8d4ff728b31", "bridge": "br-int", "label": "tempest-network-smoke--810131062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17fbb695-95", "ovs_interfaceid": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
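The network_info cache entry above is plain JSON; recovering the fixed and floating addresses is a walk over subnets and ips. A sketch against a trimmed copy of the logged entry (the `addresses` helper is illustrative, not a Nova API):

```python
import json

# Trimmed copy of the network_info entry logged above.
network_info = json.loads('''[{
  "id": "17fbb695-95a2-4ae4-86e1-5ef421f19484",
  "address": "fa:16:3e:55:96:1d",
  "network": {"subnets": [{"cidr": "10.100.0.0/28",
    "ips": [{"address": "10.100.0.5", "type": "fixed",
             "floating_ips": [{"address": "192.168.122.202",
                               "type": "floating"}]}]}]}
}]''')

def addresses(vif):
    """Collect fixed and floating IPs from one VIF entry."""
    out = []
    for subnet in vif["network"]["subnets"]:
        for ip in subnet["ips"]:
            out.append(ip["address"])
            out.extend(f["address"] for f in ip.get("floating_ips", []))
    return out

print(addresses(network_info[0]))  # ['10.100.0.5', '192.168.122.202']
```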
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.274 187256 DEBUG oslo_concurrency.lockutils [req-56c795fc-f86f-400e-bce8-7f49c5211d39 req-b9f7f546-4806-4c17-ac12-dfe0df5d88c6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-62d23852-bb8c-4240-a037-86cfa1b3a07c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.350 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.593 187256 DEBUG nova.compute.manager [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Received event network-vif-plugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.594 187256 DEBUG oslo_concurrency.lockutils [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.595 187256 DEBUG oslo_concurrency.lockutils [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.595 187256 DEBUG oslo_concurrency.lockutils [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.595 187256 DEBUG nova.compute.manager [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Processing event network-vif-plugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.595 187256 DEBUG nova.compute.manager [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Received event network-vif-plugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.596 187256 DEBUG oslo_concurrency.lockutils [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.596 187256 DEBUG oslo_concurrency.lockutils [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.596 187256 DEBUG oslo_concurrency.lockutils [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.596 187256 DEBUG nova.compute.manager [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] No waiting events found dispatching network-vif-plugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.597 187256 WARNING nova.compute.manager [req-47b22cfe-b449-474f-af31-5630a25c102f req-641d4ed7-9f40-4d36-8b5a-ab1b4a3cd052 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Received unexpected event network-vif-plugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 for instance with vm_state building and task_state spawning.#033[00m
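The "No waiting events found" and "Received unexpected event" lines reflect Nova's prepare/pop event handshake: the spawning thread registers a waiter for network-vif-plugged, and the external event from Neutron either signals it or, if nothing is registered (instance already past that point, or deleted as at 16:29:26.500), is logged as unexpected. A simplified, hypothetical stand-in for that pattern:

```python
import threading

class InstanceEvents:
    """Sketch of the prepare/pop pattern; not Nova's real class."""

    def __init__(self):
        self._events = {}        # (instance_uuid, event_name) -> Event
        self._lock = threading.Lock()

    def prepare_for_event(self, instance_uuid, name):
        ev = threading.Event()
        with self._lock:
            self._events[(instance_uuid, name)] = ev
        return ev

    def pop_instance_event(self, instance_uuid, name):
        with self._lock:
            ev = self._events.pop((instance_uuid, name), None)
        if ev is None:
            # Corresponds to "No waiting events found dispatching ..."
            print(f"No waiting events found dispatching {name}")
        return ev

events = InstanceEvents()
waiter = events.prepare_for_event("62d23852", "network-vif-plugged")
popped = events.pop_instance_event("62d23852", "network-vif-plugged")
if popped:
    popped.set()  # wakes the thread blocked in wait_for_instance_event
assert waiter.is_set()
```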
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.598 187256 DEBUG nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.602 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347367.602147, 62d23852-bb8c-4240-a037-86cfa1b3a07c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.603 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.605 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.609 187256 INFO nova.virt.libvirt.driver [-] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Instance spawned successfully.#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.610 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.628 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.636 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.639 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.640 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.640 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.640 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.641 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.641 187256 DEBUG nova.virt.libvirt.driver [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.673 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.721 187256 INFO nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Took 19.74 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.722 187256 DEBUG nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.788 187256 INFO nova.compute.manager [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Took 20.25 seconds to build instance.#033[00m
Nov 28 11:29:27 np0005538960 nova_compute[187252]: 2025-11-28 16:29:27.808 187256 DEBUG oslo_concurrency.lockutils [None req-99deff1f-e375-4291-b64c-a0b2723cbe59 a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.345s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:28 np0005538960 nova_compute[187252]: 2025-11-28 16:29:28.211 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:28 np0005538960 nova_compute[187252]: 2025-11-28 16:29:28.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:29:31 np0005538960 podman[220904]: 2025-11-28 16:29:31.174064053 +0000 UTC m=+0.070746708 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:29:31 np0005538960 nova_compute[187252]: 2025-11-28 16:29:31.793 187256 DEBUG oslo_concurrency.lockutils [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "62d23852-bb8c-4240-a037-86cfa1b3a07c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:31 np0005538960 nova_compute[187252]: 2025-11-28 16:29:31.794 187256 DEBUG oslo_concurrency.lockutils [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:31 np0005538960 nova_compute[187252]: 2025-11-28 16:29:31.795 187256 DEBUG oslo_concurrency.lockutils [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:31 np0005538960 nova_compute[187252]: 2025-11-28 16:29:31.795 187256 DEBUG oslo_concurrency.lockutils [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:31 np0005538960 nova_compute[187252]: 2025-11-28 16:29:31.795 187256 DEBUG oslo_concurrency.lockutils [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:31 np0005538960 nova_compute[187252]: 2025-11-28 16:29:31.797 187256 INFO nova.compute.manager [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Terminating instance#033[00m
Nov 28 11:29:31 np0005538960 nova_compute[187252]: 2025-11-28 16:29:31.797 187256 DEBUG nova.compute.manager [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:29:31 np0005538960 kernel: tap17fbb695-95 (unregistering): left promiscuous mode
Nov 28 11:29:31 np0005538960 NetworkManager[55548]: <info>  [1764347371.8201] device (tap17fbb695-95): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:29:31 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:31Z|00175|binding|INFO|Releasing lport 17fbb695-95a2-4ae4-86e1-5ef421f19484 from this chassis (sb_readonly=0)
Nov 28 11:29:31 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:31Z|00176|binding|INFO|Setting lport 17fbb695-95a2-4ae4-86e1-5ef421f19484 down in Southbound
Nov 28 11:29:31 np0005538960 nova_compute[187252]: 2025-11-28 16:29:31.834 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:31 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:31Z|00177|binding|INFO|Removing iface tap17fbb695-95 ovn-installed in OVS
Nov 28 11:29:31 np0005538960 nova_compute[187252]: 2025-11-28 16:29:31.850 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:31.853 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:55:96:1d 10.100.0.5'], port_security=['fa:16:3e:55:96:1d 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1231382668', 'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '62d23852-bb8c-4240-a037-86cfa1b3a07c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-190e04f9-d028-441a-93bf-e8d4ff728b31', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1231382668', 'neutron:project_id': 'd9d93d32b42b46fbb1392048dd8941bb', 'neutron:revision_number': '9', 'neutron:security_group_ids': '4551c4f3-bc67-4846-a687-f0bfbc9398f2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c938c518-d34f-4f96-8307-117b8a9fa334, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=17fbb695-95a2-4ae4-86e1-5ef421f19484) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:29:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:31.855 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 17fbb695-95a2-4ae4-86e1-5ef421f19484 in datapath 190e04f9-d028-441a-93bf-e8d4ff728b31 unbound from our chassis#033[00m
Nov 28 11:29:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:31.856 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 190e04f9-d028-441a-93bf-e8d4ff728b31, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:29:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:31.858 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[baacbca3-4a91-42c3-9173-fd1d9bf5c9eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:31.860 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31 namespace which is not needed anymore#033[00m
Nov 28 11:29:31 np0005538960 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000026.scope: Deactivated successfully.
Nov 28 11:29:31 np0005538960 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000026.scope: Consumed 4.697s CPU time.
Nov 28 11:29:31 np0005538960 systemd-machined[153518]: Machine qemu-13-instance-00000026 terminated.
Nov 28 11:29:31 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:31Z|00178|binding|INFO|Releasing lport 2cb95f44-afec-455d-89b0-45fb2a803d6d from this chassis (sb_readonly=0)
Nov 28 11:29:31 np0005538960 nova_compute[187252]: 2025-11-28 16:29:31.978 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:32 np0005538960 neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31[220889]: [NOTICE]   (220893) : haproxy version is 2.8.14-c23fe91
Nov 28 11:29:32 np0005538960 neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31[220889]: [NOTICE]   (220893) : path to executable is /usr/sbin/haproxy
Nov 28 11:29:32 np0005538960 neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31[220889]: [ALERT]    (220893) : Current worker (220895) exited with code 143 (Terminated)
Nov 28 11:29:32 np0005538960 neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31[220889]: [WARNING]  (220893) : All workers exited. Exiting... (0)
Nov 28 11:29:32 np0005538960 systemd[1]: libpod-6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a.scope: Deactivated successfully.
Nov 28 11:29:32 np0005538960 podman[220949]: 2025-11-28 16:29:32.01849793 +0000 UTC m=+0.067246212 container died 6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:29:32 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a-userdata-shm.mount: Deactivated successfully.
Nov 28 11:29:32 np0005538960 systemd[1]: var-lib-containers-storage-overlay-12033acb456d86518f29ac2f50aaa635103565f110873f8ccbfb43ce32174413-merged.mount: Deactivated successfully.
Nov 28 11:29:32 np0005538960 ovn_controller[95460]: 2025-11-28T16:29:32Z|00179|binding|INFO|Releasing lport 2cb95f44-afec-455d-89b0-45fb2a803d6d from this chassis (sb_readonly=0)
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.166 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:32 np0005538960 podman[220949]: 2025-11-28 16:29:32.168553451 +0000 UTC m=+0.217301713 container cleanup 6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:29:32 np0005538960 systemd[1]: libpod-conmon-6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a.scope: Deactivated successfully.
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.187 187256 INFO nova.virt.libvirt.driver [-] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Instance destroyed successfully.#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.188 187256 DEBUG nova.objects.instance [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lazy-loading 'resources' on Instance uuid 62d23852-bb8c-4240-a037-86cfa1b3a07c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.208 187256 DEBUG nova.virt.libvirt.vif [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:29:06Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1332009570',display_name='tempest-TestNetworkBasicOps-server-1332009570',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1332009570',id=38,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJQ0RTFDs7EAnXlW6axzbeQzyPz2JX8FpFU+5BzPCu2vdnl062oP4eGavf8/eCy1TDBc3eW1tX7zh69rkuU/5qVDbHgMYHK+wBETL/mOYZZgjsNWcxYMaAfdQcdMxRjmtA==',key_name='tempest-TestNetworkBasicOps-1440030787',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:29:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d9d93d32b42b46fbb1392048dd8941bb',ramdisk_id='',reservation_id='r-03twp9uz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-561116586',owner_user_name='tempest-TestNetworkBasicOps-561116586-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:29:27Z,user_data=None,user_id='a4105532118847f583e4bf7594336693',uuid=62d23852-bb8c-4240-a037-86cfa1b3a07c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "address": "fa:16:3e:55:96:1d", "network": {"id": "190e04f9-d028-441a-93bf-e8d4ff728b31", "bridge": "br-int", "label": "tempest-network-smoke--810131062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17fbb695-95", "ovs_interfaceid": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.208 187256 DEBUG nova.network.os_vif_util [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converting VIF {"id": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "address": "fa:16:3e:55:96:1d", "network": {"id": "190e04f9-d028-441a-93bf-e8d4ff728b31", "bridge": "br-int", "label": "tempest-network-smoke--810131062", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d9d93d32b42b46fbb1392048dd8941bb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap17fbb695-95", "ovs_interfaceid": "17fbb695-95a2-4ae4-86e1-5ef421f19484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.209 187256 DEBUG nova.network.os_vif_util [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:55:96:1d,bridge_name='br-int',has_traffic_filtering=True,id=17fbb695-95a2-4ae4-86e1-5ef421f19484,network=Network(190e04f9-d028-441a-93bf-e8d4ff728b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap17fbb695-95') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.209 187256 DEBUG os_vif [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:96:1d,bridge_name='br-int',has_traffic_filtering=True,id=17fbb695-95a2-4ae4-86e1-5ef421f19484,network=Network(190e04f9-d028-441a-93bf-e8d4ff728b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap17fbb695-95') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.211 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.211 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap17fbb695-95, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.213 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.214 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.217 187256 INFO os_vif [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:55:96:1d,bridge_name='br-int',has_traffic_filtering=True,id=17fbb695-95a2-4ae4-86e1-5ef421f19484,network=Network(190e04f9-d028-441a-93bf-e8d4ff728b31),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap17fbb695-95')#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.218 187256 INFO nova.virt.libvirt.driver [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Deleting instance files /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c_del#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.218 187256 INFO nova.virt.libvirt.driver [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Deletion of /var/lib/nova/instances/62d23852-bb8c-4240-a037-86cfa1b3a07c_del complete#033[00m
Nov 28 11:29:32 np0005538960 podman[220993]: 2025-11-28 16:29:32.245153281 +0000 UTC m=+0.047924391 container remove 6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:29:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:32.250 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[80680612-8aeb-446b-bb24-6fa2494447f8]: (4, ('Fri Nov 28 04:29:31 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31 (6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a)\n6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a\nFri Nov 28 04:29:32 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31 (6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a)\n6ce14a9f0ab608036fa3f761983e99fbf7a793d829de42517a6b5077fbfcd50a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:32.252 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1cd44585-2303-49de-92d3-da9d9bd67e84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:32.253 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap190e04f9-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.255 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:32 np0005538960 kernel: tap190e04f9-d0: left promiscuous mode
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.267 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:32.270 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9e21baa2-6c16-4fda-80f1-31c17ed770c4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:32.288 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[8a2892e7-69d2-4cc4-83a6-231fe4fb49fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:32.290 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[334f1087-df80-4ab6-b6b7-c9c4ee66bf31]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:32.305 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[03f42e85-0437-4c1f-a50e-993679865e62]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446457, 'reachable_time': 31793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221007, 'error': None, 'target': 'ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:32.309 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-190e04f9-d028-441a-93bf-e8d4ff728b31 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:29:32 np0005538960 systemd[1]: run-netns-ovnmeta\x2d190e04f9\x2dd028\x2d441a\x2d93bf\x2de8d4ff728b31.mount: Deactivated successfully.
Nov 28 11:29:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:29:32.309 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[ee9c346a-b05d-4792-8e8e-827428c0d022]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.338 187256 INFO nova.compute.manager [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Took 0.54 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.339 187256 DEBUG oslo.service.loopingcall [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.340 187256 DEBUG nova.compute.manager [-] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:29:32 np0005538960 nova_compute[187252]: 2025-11-28 16:29:32.340 187256 DEBUG nova.network.neutron [-] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:29:33 np0005538960 nova_compute[187252]: 2025-11-28 16:29:33.219 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.103 187256 DEBUG nova.compute.manager [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Received event network-vif-unplugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.103 187256 DEBUG oslo_concurrency.lockutils [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.103 187256 DEBUG oslo_concurrency.lockutils [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.103 187256 DEBUG oslo_concurrency.lockutils [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.104 187256 DEBUG nova.compute.manager [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] No waiting events found dispatching network-vif-unplugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.104 187256 DEBUG nova.compute.manager [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Received event network-vif-unplugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.104 187256 DEBUG nova.compute.manager [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Received event network-vif-plugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.104 187256 DEBUG oslo_concurrency.lockutils [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.104 187256 DEBUG oslo_concurrency.lockutils [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.105 187256 DEBUG oslo_concurrency.lockutils [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.105 187256 DEBUG nova.compute.manager [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] No waiting events found dispatching network-vif-plugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.105 187256 WARNING nova.compute.manager [req-77181856-c37f-467c-b15f-b8759c6140e5 req-102e577a-3749-4e12-ad87-f3c8013b9df9 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Received unexpected event network-vif-plugged-17fbb695-95a2-4ae4-86e1-5ef421f19484 for instance with vm_state active and task_state deleting.#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.238 187256 DEBUG nova.network.neutron [-] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.256 187256 INFO nova.compute.manager [-] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Took 1.92 seconds to deallocate network for instance.#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.329 187256 DEBUG oslo_concurrency.lockutils [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.330 187256 DEBUG oslo_concurrency.lockutils [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.432 187256 DEBUG nova.compute.provider_tree [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.453 187256 DEBUG nova.scheduler.client.report [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.475 187256 DEBUG oslo_concurrency.lockutils [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.539 187256 INFO nova.scheduler.client.report [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Deleted allocations for instance 62d23852-bb8c-4240-a037-86cfa1b3a07c#033[00m
Nov 28 11:29:34 np0005538960 nova_compute[187252]: 2025-11-28 16:29:34.621 187256 DEBUG oslo_concurrency.lockutils [None req-ba264eff-1f9e-4e07-967d-568c90d4d15f a4105532118847f583e4bf7594336693 d9d93d32b42b46fbb1392048dd8941bb - - default default] Lock "62d23852-bb8c-4240-a037-86cfa1b3a07c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.826s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:35 np0005538960 podman[221008]: 2025-11-28 16:29:35.172392926 +0000 UTC m=+0.071076326 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:29:36 np0005538960 nova_compute[187252]: 2025-11-28 16:29:36.215 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347361.213297, c4c0b9e1-fd20-4bc8-b105-53c8be08942f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:29:36 np0005538960 nova_compute[187252]: 2025-11-28 16:29:36.216 187256 INFO nova.compute.manager [-] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:29:36 np0005538960 nova_compute[187252]: 2025-11-28 16:29:36.252 187256 DEBUG nova.compute.manager [None req-033c40d6-3006-4892-9403-182320225e96 - - - - - -] [instance: c4c0b9e1-fd20-4bc8-b105-53c8be08942f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:29:37 np0005538960 nova_compute[187252]: 2025-11-28 16:29:37.216 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:38 np0005538960 nova_compute[187252]: 2025-11-28 16:29:38.223 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:41 np0005538960 podman[221032]: 2025-11-28 16:29:41.193404169 +0000 UTC m=+0.097525261 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:29:42 np0005538960 nova_compute[187252]: 2025-11-28 16:29:42.218 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:43 np0005538960 nova_compute[187252]: 2025-11-28 16:29:43.226 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:45 np0005538960 podman[221061]: 2025-11-28 16:29:45.157783983 +0000 UTC m=+0.061543353 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=multipathd)
Nov 28 11:29:45 np0005538960 podman[221062]: 2025-11-28 16:29:45.158000988 +0000 UTC m=+0.056739275 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:29:47 np0005538960 nova_compute[187252]: 2025-11-28 16:29:47.186 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347372.1852083, 62d23852-bb8c-4240-a037-86cfa1b3a07c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:29:47 np0005538960 nova_compute[187252]: 2025-11-28 16:29:47.187 187256 INFO nova.compute.manager [-] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:29:47 np0005538960 nova_compute[187252]: 2025-11-28 16:29:47.223 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:47 np0005538960 nova_compute[187252]: 2025-11-28 16:29:47.236 187256 DEBUG nova.compute.manager [None req-1713896e-688d-4b3c-8ac8-ecf7ab743554 - - - - - -] [instance: 62d23852-bb8c-4240-a037-86cfa1b3a07c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:29:48 np0005538960 nova_compute[187252]: 2025-11-28 16:29:48.228 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:49 np0005538960 podman[221101]: 2025-11-28 16:29:49.146922442 +0000 UTC m=+0.056427717 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:29:52 np0005538960 nova_compute[187252]: 2025-11-28 16:29:52.227 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:53 np0005538960 podman[221126]: 2025-11-28 16:29:53.169294243 +0000 UTC m=+0.072241985 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The 
Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, release=1755695350, io.buildah.version=1.33.7, architecture=x86_64, config_id=edpm, version=9.6)
Nov 28 11:29:53 np0005538960 nova_compute[187252]: 2025-11-28 16:29:53.279 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.231 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.368 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.369 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.393 187256 DEBUG nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.487 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.487 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.497 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.497 187256 INFO nova.compute.claims [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.654 187256 DEBUG nova.compute.provider_tree [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.673 187256 DEBUG nova.scheduler.client.report [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.700 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.701 187256 DEBUG nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.949 187256 DEBUG nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.950 187256 DEBUG nova.network.neutron [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:29:57 np0005538960 nova_compute[187252]: 2025-11-28 16:29:57.994 187256 INFO nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.077 187256 DEBUG nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.262 187256 DEBUG nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.264 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.265 187256 INFO nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Creating image(s)#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.266 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.267 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.268 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.285 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.287 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.352 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.354 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.355 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.367 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.387 187256 DEBUG nova.policy [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.431 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.433 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.473 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.475 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.475 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.532 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.533 187256 DEBUG nova.virt.disk.api [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Checking if we can resize image /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.533 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.588 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.589 187256 DEBUG nova.virt.disk.api [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Cannot resize image /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.590 187256 DEBUG nova.objects.instance [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'migration_context' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.612 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.613 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Ensure instance console log exists: /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.614 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.614 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:29:58 np0005538960 nova_compute[187252]: 2025-11-28 16:29:58.615 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:00 np0005538960 nova_compute[187252]: 2025-11-28 16:30:00.616 187256 DEBUG nova.network.neutron [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Successfully created port: c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:30:02 np0005538960 podman[221162]: 2025-11-28 16:30:02.157018675 +0000 UTC m=+0.065882899 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Nov 28 11:30:02 np0005538960 nova_compute[187252]: 2025-11-28 16:30:02.273 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:03 np0005538960 nova_compute[187252]: 2025-11-28 16:30:03.284 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:03 np0005538960 nova_compute[187252]: 2025-11-28 16:30:03.933 187256 DEBUG nova.network.neutron [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Successfully updated port: c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:30:03 np0005538960 nova_compute[187252]: 2025-11-28 16:30:03.969 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:30:03 np0005538960 nova_compute[187252]: 2025-11-28 16:30:03.970 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquired lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:30:03 np0005538960 nova_compute[187252]: 2025-11-28 16:30:03.970 187256 DEBUG nova.network.neutron [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:30:04 np0005538960 nova_compute[187252]: 2025-11-28 16:30:04.229 187256 DEBUG nova.network.neutron [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.106 187256 DEBUG nova.compute.manager [req-4b492722-c09f-4a84-a0f7-ae46038c60f4 req-1b59e0b0-7fbd-460a-88ce-fb1dd95d4d86 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-changed-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.107 187256 DEBUG nova.compute.manager [req-4b492722-c09f-4a84-a0f7-ae46038c60f4 req-1b59e0b0-7fbd-460a-88ce-fb1dd95d4d86 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Refreshing instance network info cache due to event network-changed-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.107 187256 DEBUG oslo_concurrency.lockutils [req-4b492722-c09f-4a84-a0f7-ae46038c60f4 req-1b59e0b0-7fbd-460a-88ce-fb1dd95d4d86 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.359 187256 DEBUG nova.network.neutron [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updating instance_info_cache with network_info: [{"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.378 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Releasing lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.379 187256 DEBUG nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Instance network_info: |[{"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.380 187256 DEBUG oslo_concurrency.lockutils [req-4b492722-c09f-4a84-a0f7-ae46038c60f4 req-1b59e0b0-7fbd-460a-88ce-fb1dd95d4d86 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.380 187256 DEBUG nova.network.neutron [req-4b492722-c09f-4a84-a0f7-ae46038c60f4 req-1b59e0b0-7fbd-460a-88ce-fb1dd95d4d86 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Refreshing network info cache for port c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.383 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Start _get_guest_xml network_info=[{"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.389 187256 WARNING nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.397 187256 DEBUG nova.virt.libvirt.host [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.397 187256 DEBUG nova.virt.libvirt.host [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.404 187256 DEBUG nova.virt.libvirt.host [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.405 187256 DEBUG nova.virt.libvirt.host [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.406 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.406 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.407 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.407 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.407 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.408 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.408 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.408 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.408 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.409 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.409 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.409 187256 DEBUG nova.virt.hardware [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.414 187256 DEBUG nova.virt.libvirt.vif [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:29:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1382753008',display_name='tempest-TestNetworkAdvancedServerOps-server-1382753008',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1382753008',id=39,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJHY6QOEV1JMKbUARcz9utL5DtOLnFnwM2R5sR2I1IkKsD/dQqWZDFD5Y/W9gE6GTadzYOlKYgSEd+LYZ20Nqa9RSL/Fm2PBKcVQxmT4Cy87yXdNA/8990HGivXatNk2Rw==',key_name='tempest-TestNetworkAdvancedServerOps-1858247008',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-284npo52',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:29:58Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=a1b30abb-1dff-48d7-ad1f-d8df62d0c976,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.414 187256 DEBUG nova.network.os_vif_util [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.415 187256 DEBUG nova.network.os_vif_util [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.416 187256 DEBUG nova.objects.instance [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'pci_devices' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.437 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <uuid>a1b30abb-1dff-48d7-ad1f-d8df62d0c976</uuid>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <name>instance-00000027</name>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1382753008</nova:name>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:30:05</nova:creationTime>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:        <nova:user uuid="5d381eba17324dd5ad798648b82d0115">tempest-TestNetworkAdvancedServerOps-762685809-project-member</nova:user>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:        <nova:project uuid="7e408bace48b41a1ac0677d300b6d288">tempest-TestNetworkAdvancedServerOps-762685809</nova:project>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:        <nova:port uuid="c47cc86a-00e6-4adc-a8ce-1ceeb7b72534">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <entry name="serial">a1b30abb-1dff-48d7-ad1f-d8df62d0c976</entry>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <entry name="uuid">a1b30abb-1dff-48d7-ad1f-d8df62d0c976</entry>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.config"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:a8:ff:58"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <target dev="tapc47cc86a-00"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/console.log" append="off"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:30:05 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:30:05 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:30:05 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:30:05 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.438 187256 DEBUG nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Preparing to wait for external event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.438 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.438 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.438 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.439 187256 DEBUG nova.virt.libvirt.vif [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:29:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1382753008',display_name='tempest-TestNetworkAdvancedServerOps-server-1382753008',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1382753008',id=39,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJHY6QOEV1JMKbUARcz9utL5DtOLnFnwM2R5sR2I1IkKsD/dQqWZDFD5Y/W9gE6GTadzYOlKYgSEd+LYZ20Nqa9RSL/Fm2PBKcVQxmT4Cy87yXdNA/8990HGivXatNk2Rw==',key_name='tempest-TestNetworkAdvancedServerOps-1858247008',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-284npo52',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:29:58Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=a1b30abb-1dff-48d7-ad1f-d8df62d0c976,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.439 187256 DEBUG nova.network.os_vif_util [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.440 187256 DEBUG nova.network.os_vif_util [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.440 187256 DEBUG os_vif [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.441 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.441 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.442 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.446 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.447 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc47cc86a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.448 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc47cc86a-00, col_values=(('external_ids', {'iface-id': 'c47cc86a-00e6-4adc-a8ce-1ceeb7b72534', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:ff:58', 'vm-uuid': 'a1b30abb-1dff-48d7-ad1f-d8df62d0c976'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.450 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:05 np0005538960 NetworkManager[55548]: <info>  [1764347405.4517] manager: (tapc47cc86a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.452 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:30:05 np0005538960 podman[221182]: 2025-11-28 16:30:05.452451515 +0000 UTC m=+0.063071260 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.458 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.459 187256 INFO os_vif [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00')#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.510 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.510 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.511 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No VIF found with MAC fa:16:3e:a8:ff:58, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:30:05 np0005538960 nova_compute[187252]: 2025-11-28 16:30:05.511 187256 INFO nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Using config drive#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.237 187256 INFO nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Creating config drive at /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.config#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.241 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz4ed1527 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.350 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.350 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.351 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.373 187256 DEBUG oslo_concurrency.processutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz4ed1527" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:30:06 np0005538960 kernel: tapc47cc86a-00: entered promiscuous mode
Nov 28 11:30:06 np0005538960 NetworkManager[55548]: <info>  [1764347406.4566] manager: (tapc47cc86a-00): new Tun device (/org/freedesktop/NetworkManager/Devices/96)
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.455 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:06Z|00180|binding|INFO|Claiming lport c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 for this chassis.
Nov 28 11:30:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:06Z|00181|binding|INFO|c47cc86a-00e6-4adc-a8ce-1ceeb7b72534: Claiming fa:16:3e:a8:ff:58 10.100.0.11
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.461 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.474 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:ff:58 10.100.0.11'], port_security=['fa:16:3e:a8:ff:58 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a1b30abb-1dff-48d7-ad1f-d8df62d0c976', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157b052-23af-40b0-bd07-76246d08673a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '2', 'neutron:security_group_ids': '5c027794-de2f-43a4-b849-302cec02015a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a298a2e7-adfa-4eff-8538-b5d3421dd91e, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.476 104369 INFO neutron.agent.ovn.metadata.agent [-] Port c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 in datapath c157b052-23af-40b0-bd07-76246d08673a bound to our chassis#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.477 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c157b052-23af-40b0-bd07-76246d08673a#033[00m
Nov 28 11:30:06 np0005538960 systemd-udevd[221227]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.489 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[831512e6-5d0c-4c94-afc0-11186469cedd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.490 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc157b052-21 in ovnmeta-c157b052-23af-40b0-bd07-76246d08673a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:30:06 np0005538960 systemd-machined[153518]: New machine qemu-14-instance-00000027.
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.496 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc157b052-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.497 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[8da8ad12-8189-4bf1-90b8-0350b9bf6568]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.499 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[83d89b5b-4108-4b29-8799-7a22e651e7fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 NetworkManager[55548]: <info>  [1764347406.5053] device (tapc47cc86a-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:30:06 np0005538960 NetworkManager[55548]: <info>  [1764347406.5063] device (tapc47cc86a-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.513 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[7474bc85-4e45-4caf-9b45-b2b7bc33ee05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.516 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:06 np0005538960 systemd[1]: Started Virtual Machine qemu-14-instance-00000027.
Nov 28 11:30:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:06Z|00182|binding|INFO|Setting lport c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 ovn-installed in OVS
Nov 28 11:30:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:06Z|00183|binding|INFO|Setting lport c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 up in Southbound
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.524 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.531 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[eb246520-4c46-4383-bc5c-72ed3fde1ae6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.562 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[ba704656-2414-42b1-8986-6c874cf60130]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 NetworkManager[55548]: <info>  [1764347406.5688] manager: (tapc157b052-20): new Veth device (/org/freedesktop/NetworkManager/Devices/97)
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.567 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[71e801a6-9578-467e-abdc-0af0e6e3e225]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 systemd-udevd[221231]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.601 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[4035908a-46ac-465b-ad0b-1c780546f15c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.605 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[59c7ad5d-7e40-4176-ab9e-5acf58bd8f18]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 NetworkManager[55548]: <info>  [1764347406.6313] device (tapc157b052-20): carrier: link connected
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.637 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[4e9726ac-d238-4bba-8ee7-4d6485cfc904]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.656 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[32286779-56cb-4128-9ac7-09fa6064f77d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157b052-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:df:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450621, 'reachable_time': 28903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221260, 'error': None, 'target': 'ovnmeta-c157b052-23af-40b0-bd07-76246d08673a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.674 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[aafaa993-14be-4995-b914-f6dc227f1a20]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:df9b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 450621, 'tstamp': 450621}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221261, 'error': None, 'target': 'ovnmeta-c157b052-23af-40b0-bd07-76246d08673a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.697 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[82591da1-d7a1-444c-a65e-354066ad6b06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157b052-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:df:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450621, 'reachable_time': 28903, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221262, 'error': None, 'target': 'ovnmeta-c157b052-23af-40b0-bd07-76246d08673a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.730 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a97fb9-92f3-43f3-8ac0-b32a42fce6b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.801 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[de62cb50-c37e-4c20-b663-d14e925063ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.808 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157b052-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.808 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.809 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc157b052-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:06 np0005538960 kernel: tapc157b052-20: entered promiscuous mode
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.812 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:06 np0005538960 NetworkManager[55548]: <info>  [1764347406.8143] manager: (tapc157b052-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.815 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc157b052-20, col_values=(('external_ids', {'iface-id': '64711e8e-e597-4016-a4d5-c6f7d7c85c3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:06 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:06Z|00184|binding|INFO|Releasing lport 64711e8e-e597-4016-a4d5-c6f7d7c85c3a from this chassis (sb_readonly=0)
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.842 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.843 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c157b052-23af-40b0-bd07-76246d08673a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c157b052-23af-40b0-bd07-76246d08673a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.844 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9f813e6d-ffa1-4b24-887b-ee5f88efc45a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.845 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-c157b052-23af-40b0-bd07-76246d08673a
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/c157b052-23af-40b0-bd07-76246d08673a.pid.haproxy
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID c157b052-23af-40b0-bd07-76246d08673a
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:30:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:06.847 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c157b052-23af-40b0-bd07-76246d08673a', 'env', 'PROCESS_TAG=haproxy-c157b052-23af-40b0-bd07-76246d08673a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c157b052-23af-40b0-bd07-76246d08673a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.864 187256 DEBUG nova.compute.manager [req-89bccdb2-b2fa-4d1e-9e04-21ced24494b5 req-aee31a74-0d88-476f-ba10-f6e950f105d3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.865 187256 DEBUG oslo_concurrency.lockutils [req-89bccdb2-b2fa-4d1e-9e04-21ced24494b5 req-aee31a74-0d88-476f-ba10-f6e950f105d3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.865 187256 DEBUG oslo_concurrency.lockutils [req-89bccdb2-b2fa-4d1e-9e04-21ced24494b5 req-aee31a74-0d88-476f-ba10-f6e950f105d3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.865 187256 DEBUG oslo_concurrency.lockutils [req-89bccdb2-b2fa-4d1e-9e04-21ced24494b5 req-aee31a74-0d88-476f-ba10-f6e950f105d3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.865 187256 DEBUG nova.compute.manager [req-89bccdb2-b2fa-4d1e-9e04-21ced24494b5 req-aee31a74-0d88-476f-ba10-f6e950f105d3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Processing event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.989 187256 DEBUG nova.network.neutron [req-4b492722-c09f-4a84-a0f7-ae46038c60f4 req-1b59e0b0-7fbd-460a-88ce-fb1dd95d4d86 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updated VIF entry in instance network info cache for port c47cc86a-00e6-4adc-a8ce-1ceeb7b72534. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:30:06 np0005538960 nova_compute[187252]: 2025-11-28 16:30:06.990 187256 DEBUG nova.network.neutron [req-4b492722-c09f-4a84-a0f7-ae46038c60f4 req-1b59e0b0-7fbd-460a-88ce-fb1dd95d4d86 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updating instance_info_cache with network_info: [{"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.009 187256 DEBUG oslo_concurrency.lockutils [req-4b492722-c09f-4a84-a0f7-ae46038c60f4 req-1b59e0b0-7fbd-460a-88ce-fb1dd95d4d86 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.014 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347407.0141351, a1b30abb-1dff-48d7-ad1f-d8df62d0c976 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.015 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] VM Started (Lifecycle Event)#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.017 187256 DEBUG nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.021 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.026 187256 INFO nova.virt.libvirt.driver [-] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Instance spawned successfully.#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.027 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.036 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.040 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.047 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.047 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.047 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.048 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.048 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.048 187256 DEBUG nova.virt.libvirt.driver [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.057 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.057 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347407.014448, a1b30abb-1dff-48d7-ad1f-d8df62d0c976 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.057 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.105 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.109 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347407.0206606, a1b30abb-1dff-48d7-ad1f-d8df62d0c976 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.110 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.126 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.130 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.153 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:30:07 np0005538960 podman[221301]: 2025-11-28 16:30:07.232564766 +0000 UTC m=+0.053902146 container create 8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 11:30:07 np0005538960 systemd[1]: Started libpod-conmon-8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418.scope.
Nov 28 11:30:07 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:30:07 np0005538960 podman[221301]: 2025-11-28 16:30:07.204581013 +0000 UTC m=+0.025918393 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:30:07 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/824e55f3c3b732459647597f41512882f76507902906876b66e23a1e6d898d64/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:30:07 np0005538960 podman[221301]: 2025-11-28 16:30:07.31917676 +0000 UTC m=+0.140514150 container init 8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:30:07 np0005538960 podman[221301]: 2025-11-28 16:30:07.325220497 +0000 UTC m=+0.146557867 container start 8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:30:07 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221316]: [NOTICE]   (221320) : New worker (221322) forked
Nov 28 11:30:07 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221316]: [NOTICE]   (221320) : Loading success.
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.663 187256 INFO nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Took 9.40 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.664 187256 DEBUG nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.731 187256 INFO nova.compute.manager [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Took 10.28 seconds to build instance.#033[00m
Nov 28 11:30:07 np0005538960 nova_compute[187252]: 2025-11-28 16:30:07.756 187256 DEBUG oslo_concurrency.lockutils [None req-6fde9a4e-b59b-4af4-b96e-18d6fb559fba 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:08 np0005538960 nova_compute[187252]: 2025-11-28 16:30:08.287 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:10 np0005538960 nova_compute[187252]: 2025-11-28 16:30:10.450 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:11 np0005538960 nova_compute[187252]: 2025-11-28 16:30:11.320 187256 DEBUG nova.compute.manager [req-6835695c-6d4f-4733-acbc-2dd5149e146c req-fe8ad66c-a27c-42d4-9d50-16a6b8bb374d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:30:11 np0005538960 nova_compute[187252]: 2025-11-28 16:30:11.320 187256 DEBUG oslo_concurrency.lockutils [req-6835695c-6d4f-4733-acbc-2dd5149e146c req-fe8ad66c-a27c-42d4-9d50-16a6b8bb374d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:11 np0005538960 nova_compute[187252]: 2025-11-28 16:30:11.320 187256 DEBUG oslo_concurrency.lockutils [req-6835695c-6d4f-4733-acbc-2dd5149e146c req-fe8ad66c-a27c-42d4-9d50-16a6b8bb374d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:11 np0005538960 nova_compute[187252]: 2025-11-28 16:30:11.321 187256 DEBUG oslo_concurrency.lockutils [req-6835695c-6d4f-4733-acbc-2dd5149e146c req-fe8ad66c-a27c-42d4-9d50-16a6b8bb374d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:11 np0005538960 nova_compute[187252]: 2025-11-28 16:30:11.321 187256 DEBUG nova.compute.manager [req-6835695c-6d4f-4733-acbc-2dd5149e146c req-fe8ad66c-a27c-42d4-9d50-16a6b8bb374d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] No waiting events found dispatching network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:30:11 np0005538960 nova_compute[187252]: 2025-11-28 16:30:11.321 187256 WARNING nova.compute.manager [req-6835695c-6d4f-4733-acbc-2dd5149e146c req-fe8ad66c-a27c-42d4-9d50-16a6b8bb374d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received unexpected event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:30:12 np0005538960 podman[221331]: 2025-11-28 16:30:12.199484436 +0000 UTC m=+0.101703502 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 11:30:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:13.176 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:30:13 np0005538960 nova_compute[187252]: 2025-11-28 16:30:13.176 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:13.178 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:30:13 np0005538960 nova_compute[187252]: 2025-11-28 16:30:13.289 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:14 np0005538960 NetworkManager[55548]: <info>  [1764347414.5707] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Nov 28 11:30:14 np0005538960 nova_compute[187252]: 2025-11-28 16:30:14.571 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:14 np0005538960 NetworkManager[55548]: <info>  [1764347414.5717] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Nov 28 11:30:14 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:14Z|00185|binding|INFO|Releasing lport 64711e8e-e597-4016-a4d5-c6f7d7c85c3a from this chassis (sb_readonly=0)
Nov 28 11:30:14 np0005538960 nova_compute[187252]: 2025-11-28 16:30:14.687 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:14 np0005538960 nova_compute[187252]: 2025-11-28 16:30:14.697 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:15 np0005538960 nova_compute[187252]: 2025-11-28 16:30:15.453 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:15 np0005538960 podman[221361]: 2025-11-28 16:30:15.542346254 +0000 UTC m=+0.068067932 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:30:15 np0005538960 podman[221362]: 2025-11-28 16:30:15.566816331 +0000 UTC m=+0.086084521 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 28 11:30:15 np0005538960 nova_compute[187252]: 2025-11-28 16:30:15.611 187256 DEBUG nova.compute.manager [req-fbc8f624-9117-4b73-b795-ec9d2725f9bf req-f0d0666d-9805-47cf-a753-6d58f2b86194 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-changed-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:30:15 np0005538960 nova_compute[187252]: 2025-11-28 16:30:15.611 187256 DEBUG nova.compute.manager [req-fbc8f624-9117-4b73-b795-ec9d2725f9bf req-f0d0666d-9805-47cf-a753-6d58f2b86194 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Refreshing instance network info cache due to event network-changed-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:30:15 np0005538960 nova_compute[187252]: 2025-11-28 16:30:15.611 187256 DEBUG oslo_concurrency.lockutils [req-fbc8f624-9117-4b73-b795-ec9d2725f9bf req-f0d0666d-9805-47cf-a753-6d58f2b86194 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:30:15 np0005538960 nova_compute[187252]: 2025-11-28 16:30:15.612 187256 DEBUG oslo_concurrency.lockutils [req-fbc8f624-9117-4b73-b795-ec9d2725f9bf req-f0d0666d-9805-47cf-a753-6d58f2b86194 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:30:15 np0005538960 nova_compute[187252]: 2025-11-28 16:30:15.612 187256 DEBUG nova.network.neutron [req-fbc8f624-9117-4b73-b795-ec9d2725f9bf req-f0d0666d-9805-47cf-a753-6d58f2b86194 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Refreshing network info cache for port c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:30:18 np0005538960 nova_compute[187252]: 2025-11-28 16:30:18.294 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:18 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:18Z|00186|binding|INFO|Releasing lport 64711e8e-e597-4016-a4d5-c6f7d7c85c3a from this chassis (sb_readonly=0)
Nov 28 11:30:18 np0005538960 nova_compute[187252]: 2025-11-28 16:30:18.709 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:19 np0005538960 nova_compute[187252]: 2025-11-28 16:30:19.234 187256 DEBUG nova.network.neutron [req-fbc8f624-9117-4b73-b795-ec9d2725f9bf req-f0d0666d-9805-47cf-a753-6d58f2b86194 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updated VIF entry in instance network info cache for port c47cc86a-00e6-4adc-a8ce-1ceeb7b72534. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:30:19 np0005538960 nova_compute[187252]: 2025-11-28 16:30:19.235 187256 DEBUG nova.network.neutron [req-fbc8f624-9117-4b73-b795-ec9d2725f9bf req-f0d0666d-9805-47cf-a753-6d58f2b86194 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updating instance_info_cache with network_info: [{"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:30:19 np0005538960 nova_compute[187252]: 2025-11-28 16:30:19.280 187256 DEBUG oslo_concurrency.lockutils [req-fbc8f624-9117-4b73-b795-ec9d2725f9bf req-f0d0666d-9805-47cf-a753-6d58f2b86194 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:30:20 np0005538960 podman[221416]: 2025-11-28 16:30:20.159278471 +0000 UTC m=+0.061673936 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:30:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:20.180 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:20 np0005538960 nova_compute[187252]: 2025-11-28 16:30:20.455 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:20Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a8:ff:58 10.100.0.11
Nov 28 11:30:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:20Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a8:ff:58 10.100.0.11
Nov 28 11:30:21 np0005538960 nova_compute[187252]: 2025-11-28 16:30:21.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:30:21 np0005538960 nova_compute[187252]: 2025-11-28 16:30:21.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:30:23 np0005538960 nova_compute[187252]: 2025-11-28 16:30:23.294 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:23 np0005538960 nova_compute[187252]: 2025-11-28 16:30:23.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:30:24 np0005538960 podman[221442]: 2025-11-28 16:30:24.247027097 +0000 UTC m=+0.070593593 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, config_id=edpm, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc.)
Nov 28 11:30:24 np0005538960 nova_compute[187252]: 2025-11-28 16:30:24.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:30:25 np0005538960 nova_compute[187252]: 2025-11-28 16:30:25.458 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:26 np0005538960 nova_compute[187252]: 2025-11-28 16:30:26.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:30:26 np0005538960 nova_compute[187252]: 2025-11-28 16:30:26.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:30:26 np0005538960 nova_compute[187252]: 2025-11-28 16:30:26.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:30:26 np0005538960 nova_compute[187252]: 2025-11-28 16:30:26.569 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:30:26 np0005538960 nova_compute[187252]: 2025-11-28 16:30:26.570 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:30:26 np0005538960 nova_compute[187252]: 2025-11-28 16:30:26.570 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:30:26 np0005538960 nova_compute[187252]: 2025-11-28 16:30:26.570 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:26 np0005538960 nova_compute[187252]: 2025-11-28 16:30:26.620 187256 INFO nova.compute.manager [None req-f18ed7dc-40da-4bb3-bdf6-4974135b3107 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Get console output#033[00m
Nov 28 11:30:26 np0005538960 nova_compute[187252]: 2025-11-28 16:30:26.626 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:30:27 np0005538960 nova_compute[187252]: 2025-11-28 16:30:27.004 187256 DEBUG oslo_concurrency.lockutils [None req-69a6c812-443b-43e6-a375-af6156676761 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:27 np0005538960 nova_compute[187252]: 2025-11-28 16:30:27.005 187256 DEBUG oslo_concurrency.lockutils [None req-69a6c812-443b-43e6-a375-af6156676761 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:27 np0005538960 nova_compute[187252]: 2025-11-28 16:30:27.005 187256 DEBUG nova.compute.manager [None req-69a6c812-443b-43e6-a375-af6156676761 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:30:27 np0005538960 nova_compute[187252]: 2025-11-28 16:30:27.009 187256 DEBUG nova.compute.manager [None req-69a6c812-443b-43e6-a375-af6156676761 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Nov 28 11:30:27 np0005538960 nova_compute[187252]: 2025-11-28 16:30:27.011 187256 DEBUG nova.objects.instance [None req-69a6c812-443b-43e6-a375-af6156676761 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'flavor' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:27 np0005538960 nova_compute[187252]: 2025-11-28 16:30:27.034 187256 DEBUG nova.virt.libvirt.driver [None req-69a6c812-443b-43e6-a375-af6156676761 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.297 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.451 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updating instance_info_cache with network_info: [{"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.469 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.469 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.471 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.471 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.471 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.472 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.504 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.505 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.505 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.505 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.573 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.635 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.636 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.695 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.867 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.868 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5551MB free_disk=73.3093032836914GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.869 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.869 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.948 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.949 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.949 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:30:28 np0005538960 nova_compute[187252]: 2025-11-28 16:30:28.996 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.013 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.036 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.036 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:29 np0005538960 kernel: tapc47cc86a-00 (unregistering): left promiscuous mode
Nov 28 11:30:29 np0005538960 NetworkManager[55548]: <info>  [1764347429.2247] device (tapc47cc86a-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:30:29 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:29Z|00187|binding|INFO|Releasing lport c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 from this chassis (sb_readonly=0)
Nov 28 11:30:29 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:29Z|00188|binding|INFO|Setting lport c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 down in Southbound
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.236 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:29 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:29Z|00189|binding|INFO|Removing iface tapc47cc86a-00 ovn-installed in OVS
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.239 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.246 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:ff:58 10.100.0.11'], port_security=['fa:16:3e:a8:ff:58 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a1b30abb-1dff-48d7-ad1f-d8df62d0c976', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157b052-23af-40b0-bd07-76246d08673a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '4', 'neutron:security_group_ids': '5c027794-de2f-43a4-b849-302cec02015a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a298a2e7-adfa-4eff-8538-b5d3421dd91e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.248 104369 INFO neutron.agent.ovn.metadata.agent [-] Port c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 in datapath c157b052-23af-40b0-bd07-76246d08673a unbound from our chassis#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.249 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c157b052-23af-40b0-bd07-76246d08673a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.251 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[64ac6f77-50e8-4836-87d2-5569e560ddc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.252 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c157b052-23af-40b0-bd07-76246d08673a namespace which is not needed anymore#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.255 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:29 np0005538960 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000027.scope: Deactivated successfully.
Nov 28 11:30:29 np0005538960 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000027.scope: Consumed 14.314s CPU time.
Nov 28 11:30:29 np0005538960 systemd-machined[153518]: Machine qemu-14-instance-00000027 terminated.
Nov 28 11:30:29 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221316]: [NOTICE]   (221320) : haproxy version is 2.8.14-c23fe91
Nov 28 11:30:29 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221316]: [NOTICE]   (221320) : path to executable is /usr/sbin/haproxy
Nov 28 11:30:29 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221316]: [WARNING]  (221320) : Exiting Master process...
Nov 28 11:30:29 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221316]: [WARNING]  (221320) : Exiting Master process...
Nov 28 11:30:29 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221316]: [ALERT]    (221320) : Current worker (221322) exited with code 143 (Terminated)
Nov 28 11:30:29 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221316]: [WARNING]  (221320) : All workers exited. Exiting... (0)
Nov 28 11:30:29 np0005538960 systemd[1]: libpod-8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418.scope: Deactivated successfully.
Nov 28 11:30:29 np0005538960 podman[221495]: 2025-11-28 16:30:29.408145476 +0000 UTC m=+0.049754474 container died 8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 11:30:29 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418-userdata-shm.mount: Deactivated successfully.
Nov 28 11:30:29 np0005538960 systemd[1]: var-lib-containers-storage-overlay-824e55f3c3b732459647597f41512882f76507902906876b66e23a1e6d898d64-merged.mount: Deactivated successfully.
Nov 28 11:30:29 np0005538960 podman[221495]: 2025-11-28 16:30:29.462218356 +0000 UTC m=+0.103827364 container cleanup 8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 11:30:29 np0005538960 systemd[1]: libpod-conmon-8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418.scope: Deactivated successfully.
Nov 28 11:30:29 np0005538960 podman[221537]: 2025-11-28 16:30:29.53079759 +0000 UTC m=+0.044384194 container remove 8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.537 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b40d3360-5a1e-4973-85a7-7685690dccf7]: (4, ('Fri Nov 28 04:30:29 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a (8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418)\n8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418\nFri Nov 28 04:30:29 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a (8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418)\n8043850ac66812203714c7e8e579db1bc51034dcd1c1f8fc7234e14247c6c418\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.539 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe8dd8e-ce6a-4e32-839c-170f83422967]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.541 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157b052-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.544 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:29 np0005538960 kernel: tapc157b052-20: left promiscuous mode
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.560 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.561 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.564 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[346dc2c7-bb24-43d0-a4d0-5fd9732ebe46]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.578 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5aac8662-b85f-4a0f-8449-7836946da915]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.580 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2e94b8-f812-4429-9bdf-139a9aa595e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.599 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c45d8ea6-7e27-411f-819a-7bf14cd5ce5a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 450613, 'reachable_time': 22834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221563, 'error': None, 'target': 'ovnmeta-c157b052-23af-40b0-bd07-76246d08673a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.603 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c157b052-23af-40b0-bd07-76246d08673a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:30:29 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:29.603 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[bbd9dd47-f2af-4a48-8b91-7a4d9e2888e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:29 np0005538960 systemd[1]: run-netns-ovnmeta\x2dc157b052\x2d23af\x2d40b0\x2dbd07\x2d76246d08673a.mount: Deactivated successfully.
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.611 187256 DEBUG nova.compute.manager [req-72dfc3e4-4af7-40ff-9561-21082a5cc99c req-2205e6d9-3943-46a8-b582-fc9c852191bb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-vif-unplugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.612 187256 DEBUG oslo_concurrency.lockutils [req-72dfc3e4-4af7-40ff-9561-21082a5cc99c req-2205e6d9-3943-46a8-b582-fc9c852191bb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.612 187256 DEBUG oslo_concurrency.lockutils [req-72dfc3e4-4af7-40ff-9561-21082a5cc99c req-2205e6d9-3943-46a8-b582-fc9c852191bb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.612 187256 DEBUG oslo_concurrency.lockutils [req-72dfc3e4-4af7-40ff-9561-21082a5cc99c req-2205e6d9-3943-46a8-b582-fc9c852191bb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.613 187256 DEBUG nova.compute.manager [req-72dfc3e4-4af7-40ff-9561-21082a5cc99c req-2205e6d9-3943-46a8-b582-fc9c852191bb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] No waiting events found dispatching network-vif-unplugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:30:29 np0005538960 nova_compute[187252]: 2025-11-28 16:30:29.613 187256 WARNING nova.compute.manager [req-72dfc3e4-4af7-40ff-9561-21082a5cc99c req-2205e6d9-3943-46a8-b582-fc9c852191bb 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received unexpected event network-vif-unplugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 for instance with vm_state active and task_state powering-off.#033[00m
Nov 28 11:30:30 np0005538960 nova_compute[187252]: 2025-11-28 16:30:30.032 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:30:30 np0005538960 nova_compute[187252]: 2025-11-28 16:30:30.051 187256 INFO nova.virt.libvirt.driver [None req-69a6c812-443b-43e6-a375-af6156676761 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Instance shutdown successfully after 3 seconds.#033[00m
Nov 28 11:30:30 np0005538960 nova_compute[187252]: 2025-11-28 16:30:30.057 187256 INFO nova.virt.libvirt.driver [-] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Instance destroyed successfully.#033[00m
Nov 28 11:30:30 np0005538960 nova_compute[187252]: 2025-11-28 16:30:30.058 187256 DEBUG nova.objects.instance [None req-69a6c812-443b-43e6-a375-af6156676761 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'numa_topology' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:30 np0005538960 nova_compute[187252]: 2025-11-28 16:30:30.070 187256 DEBUG nova.compute.manager [None req-69a6c812-443b-43e6-a375-af6156676761 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:30:30 np0005538960 nova_compute[187252]: 2025-11-28 16:30:30.142 187256 DEBUG oslo_concurrency.lockutils [None req-69a6c812-443b-43e6-a375-af6156676761 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:30 np0005538960 nova_compute[187252]: 2025-11-28 16:30:30.461 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:31 np0005538960 nova_compute[187252]: 2025-11-28 16:30:31.704 187256 DEBUG nova.compute.manager [req-9f5037d7-cbca-458e-99f1-fe0f7b79a0c1 req-e7884961-db89-4d35-a762-f48cf3676aae 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:30:31 np0005538960 nova_compute[187252]: 2025-11-28 16:30:31.705 187256 DEBUG oslo_concurrency.lockutils [req-9f5037d7-cbca-458e-99f1-fe0f7b79a0c1 req-e7884961-db89-4d35-a762-f48cf3676aae 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:31 np0005538960 nova_compute[187252]: 2025-11-28 16:30:31.705 187256 DEBUG oslo_concurrency.lockutils [req-9f5037d7-cbca-458e-99f1-fe0f7b79a0c1 req-e7884961-db89-4d35-a762-f48cf3676aae 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:31 np0005538960 nova_compute[187252]: 2025-11-28 16:30:31.705 187256 DEBUG oslo_concurrency.lockutils [req-9f5037d7-cbca-458e-99f1-fe0f7b79a0c1 req-e7884961-db89-4d35-a762-f48cf3676aae 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:31 np0005538960 nova_compute[187252]: 2025-11-28 16:30:31.706 187256 DEBUG nova.compute.manager [req-9f5037d7-cbca-458e-99f1-fe0f7b79a0c1 req-e7884961-db89-4d35-a762-f48cf3676aae 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] No waiting events found dispatching network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:30:31 np0005538960 nova_compute[187252]: 2025-11-28 16:30:31.706 187256 WARNING nova.compute.manager [req-9f5037d7-cbca-458e-99f1-fe0f7b79a0c1 req-e7884961-db89-4d35-a762-f48cf3676aae 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received unexpected event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 for instance with vm_state stopped and task_state None.#033[00m
Nov 28 11:30:33 np0005538960 podman[221564]: 2025-11-28 16:30:33.165081089 +0000 UTC m=+0.069561748 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base 
Image, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 28 11:30:33 np0005538960 nova_compute[187252]: 2025-11-28 16:30:33.342 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:34 np0005538960 nova_compute[187252]: 2025-11-28 16:30:34.302 187256 INFO nova.compute.manager [None req-e437f219-5a1c-4009-a20c-25fdbeab6dab 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Get console output#033[00m
Nov 28 11:30:34 np0005538960 nova_compute[187252]: 2025-11-28 16:30:34.492 187256 DEBUG nova.objects.instance [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'flavor' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:34 np0005538960 nova_compute[187252]: 2025-11-28 16:30:34.519 187256 DEBUG oslo_concurrency.lockutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:30:34 np0005538960 nova_compute[187252]: 2025-11-28 16:30:34.520 187256 DEBUG oslo_concurrency.lockutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquired lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:30:34 np0005538960 nova_compute[187252]: 2025-11-28 16:30:34.520 187256 DEBUG nova.network.neutron [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:30:34 np0005538960 nova_compute[187252]: 2025-11-28 16:30:34.520 187256 DEBUG nova.objects.instance [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'info_cache' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.314 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'a1b30abb-1dff-48d7-ad1f-d8df62d0c976', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1382753008', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000027', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '7e408bace48b41a1ac0677d300b6d288', 'user_id': '5d381eba17324dd5ad798648b82d0115', 'hostId': 'cd100fb82324667bbd51f1f77fa64a7bda5828863300b4bb1bb03a6d', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.315 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.316 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.316 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.317 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.317 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1382753008>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1382753008>]
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.317 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.318 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.318 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.319 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.319 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.319 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.320 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.320 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.320 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.321 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.321 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.323 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.323 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.323 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.323 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1382753008>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1382753008>]
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.323 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.324 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.324 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.325 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.325 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.325 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.325 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.326 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.326 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.327 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.327 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.328 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.329 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.330 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.330 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.330 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1382753008>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1382753008>]
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.330 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.331 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.331 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.331 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.332 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.332 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.333 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.333 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.334 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.334 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.334 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.334 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1382753008>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1382753008>]
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.335 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 11:30:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:30:35.335 12 DEBUG ceilometer.compute.pollsters [-] Instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-00000027, id=a1b30abb-1dff-48d7-ad1f-d8df62d0c976>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Nov 28 11:30:35 np0005538960 nova_compute[187252]: 2025-11-28 16:30:35.465 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:35 np0005538960 nova_compute[187252]: 2025-11-28 16:30:35.801 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:35 np0005538960 nova_compute[187252]: 2025-11-28 16:30:35.950 187256 DEBUG nova.network.neutron [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updating instance_info_cache with network_info: [{"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:30:35 np0005538960 nova_compute[187252]: 2025-11-28 16:30:35.966 187256 DEBUG oslo_concurrency.lockutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Releasing lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:30:35 np0005538960 nova_compute[187252]: 2025-11-28 16:30:35.989 187256 INFO nova.virt.libvirt.driver [-] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Instance destroyed successfully.#033[00m
Nov 28 11:30:35 np0005538960 nova_compute[187252]: 2025-11-28 16:30:35.990 187256 DEBUG nova.objects.instance [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'numa_topology' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:35.999 187256 DEBUG nova.objects.instance [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'resources' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.013 187256 DEBUG nova.virt.libvirt.vif [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:29:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1382753008',display_name='tempest-TestNetworkAdvancedServerOps-server-1382753008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1382753008',id=39,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJHY6QOEV1JMKbUARcz9utL5DtOLnFnwM2R5sR2I1IkKsD/dQqWZDFD5Y/W9gE6GTadzYOlKYgSEd+LYZ20Nqa9RSL/Fm2PBKcVQxmT4Cy87yXdNA/8990HGivXatNk2Rw==',key_name='tempest-TestNetworkAdvancedServerOps-1858247008',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:30:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-284npo52',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:30:30Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=a1b30abb-1dff-48d7-ad1f-d8df62d0c976,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.014 187256 DEBUG nova.network.os_vif_util [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.014 187256 DEBUG nova.network.os_vif_util [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.015 187256 DEBUG os_vif [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.016 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.017 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc47cc86a-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.018 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.019 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.022 187256 INFO os_vif [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00')#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.029 187256 DEBUG nova.virt.libvirt.driver [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Start _get_guest_xml network_info=[{"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.033 187256 WARNING nova.virt.libvirt.driver [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.038 187256 DEBUG nova.virt.libvirt.host [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.040 187256 DEBUG nova.virt.libvirt.host [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.044 187256 DEBUG nova.virt.libvirt.host [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.045 187256 DEBUG nova.virt.libvirt.host [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.046 187256 DEBUG nova.virt.libvirt.driver [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.046 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.047 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.047 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.047 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.047 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.048 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.048 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.048 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.048 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.048 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.049 187256 DEBUG nova.virt.hardware [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.049 187256 DEBUG nova.objects.instance [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'vcpu_model' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.062 187256 DEBUG oslo_concurrency.processutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.132 187256 DEBUG oslo_concurrency.processutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.config --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.133 187256 DEBUG oslo_concurrency.lockutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.134 187256 DEBUG oslo_concurrency.lockutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.135 187256 DEBUG oslo_concurrency.lockutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.136 187256 DEBUG nova.virt.libvirt.vif [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:29:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1382753008',display_name='tempest-TestNetworkAdvancedServerOps-server-1382753008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1382753008',id=39,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJHY6QOEV1JMKbUARcz9utL5DtOLnFnwM2R5sR2I1IkKsD/dQqWZDFD5Y/W9gE6GTadzYOlKYgSEd+LYZ20Nqa9RSL/Fm2PBKcVQxmT4Cy87yXdNA/8990HGivXatNk2Rw==',key_name='tempest-TestNetworkAdvancedServerOps-1858247008',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:30:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-284npo52',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:30:30Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=a1b30abb-1dff-48d7-ad1f-d8df62d0c976,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.137 187256 DEBUG nova.network.os_vif_util [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.138 187256 DEBUG nova.network.os_vif_util [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.139 187256 DEBUG nova.objects.instance [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'pci_devices' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:36 np0005538960 podman[221585]: 2025-11-28 16:30:36.15525382 +0000 UTC m=+0.059424371 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.156 187256 DEBUG nova.virt.libvirt.driver [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <uuid>a1b30abb-1dff-48d7-ad1f-d8df62d0c976</uuid>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <name>instance-00000027</name>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1382753008</nova:name>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:30:36</nova:creationTime>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:        <nova:user uuid="5d381eba17324dd5ad798648b82d0115">tempest-TestNetworkAdvancedServerOps-762685809-project-member</nova:user>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:        <nova:project uuid="7e408bace48b41a1ac0677d300b6d288">tempest-TestNetworkAdvancedServerOps-762685809</nova:project>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:        <nova:port uuid="c47cc86a-00e6-4adc-a8ce-1ceeb7b72534">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <entry name="serial">a1b30abb-1dff-48d7-ad1f-d8df62d0c976</entry>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <entry name="uuid">a1b30abb-1dff-48d7-ad1f-d8df62d0c976</entry>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk.config"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:a8:ff:58"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <target dev="tapc47cc86a-00"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/console.log" append="off"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <input type="keyboard" bus="usb"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:30:36 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:30:36 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:30:36 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:30:36 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.158 187256 DEBUG oslo_concurrency.processutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.224 187256 DEBUG oslo_concurrency.processutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.225 187256 DEBUG oslo_concurrency.processutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.288 187256 DEBUG oslo_concurrency.processutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.290 187256 DEBUG nova.objects.instance [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'trusted_certs' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.304 187256 DEBUG oslo_concurrency.processutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.364 187256 DEBUG oslo_concurrency.processutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.365 187256 DEBUG nova.virt.disk.api [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Checking if we can resize image /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.366 187256 DEBUG oslo_concurrency.processutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.429 187256 DEBUG oslo_concurrency.processutils [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.431 187256 DEBUG nova.virt.disk.api [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Cannot resize image /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.431 187256 DEBUG nova.objects.instance [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'migration_context' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.446 187256 DEBUG nova.virt.libvirt.vif [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:29:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1382753008',display_name='tempest-TestNetworkAdvancedServerOps-server-1382753008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1382753008',id=39,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJHY6QOEV1JMKbUARcz9utL5DtOLnFnwM2R5sR2I1IkKsD/dQqWZDFD5Y/W9gE6GTadzYOlKYgSEd+LYZ20Nqa9RSL/Fm2PBKcVQxmT4Cy87yXdNA/8990HGivXatNk2Rw==',key_name='tempest-TestNetworkAdvancedServerOps-1858247008',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:30:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-284npo52',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:30:30Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=a1b30abb-1dff-48d7-ad1f-d8df62d0c976,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.447 187256 DEBUG nova.network.os_vif_util [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.449 187256 DEBUG nova.network.os_vif_util [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.449 187256 DEBUG os_vif [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.450 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.450 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.451 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.454 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.454 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc47cc86a-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.454 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc47cc86a-00, col_values=(('external_ids', {'iface-id': 'c47cc86a-00e6-4adc-a8ce-1ceeb7b72534', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a8:ff:58', 'vm-uuid': 'a1b30abb-1dff-48d7-ad1f-d8df62d0c976'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.456 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 NetworkManager[55548]: <info>  [1764347436.4572] manager: (tapc47cc86a-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.461 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.461 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.462 187256 INFO os_vif [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00')#033[00m
Nov 28 11:30:36 np0005538960 kernel: tapc47cc86a-00: entered promiscuous mode
Nov 28 11:30:36 np0005538960 NetworkManager[55548]: <info>  [1764347436.5464] manager: (tapc47cc86a-00): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Nov 28 11:30:36 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:36Z|00190|binding|INFO|Claiming lport c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 for this chassis.
Nov 28 11:30:36 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:36Z|00191|binding|INFO|c47cc86a-00e6-4adc-a8ce-1ceeb7b72534: Claiming fa:16:3e:a8:ff:58 10.100.0.11
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.548 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.557 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:ff:58 10.100.0.11'], port_security=['fa:16:3e:a8:ff:58 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a1b30abb-1dff-48d7-ad1f-d8df62d0c976', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157b052-23af-40b0-bd07-76246d08673a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '5', 'neutron:security_group_ids': '5c027794-de2f-43a4-b849-302cec02015a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a298a2e7-adfa-4eff-8538-b5d3421dd91e, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.558 104369 INFO neutron.agent.ovn.metadata.agent [-] Port c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 in datapath c157b052-23af-40b0-bd07-76246d08673a bound to our chassis#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.559 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c157b052-23af-40b0-bd07-76246d08673a#033[00m
Nov 28 11:30:36 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:36Z|00192|binding|INFO|Setting lport c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 ovn-installed in OVS
Nov 28 11:30:36 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:36Z|00193|binding|INFO|Setting lport c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 up in Southbound
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.564 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.568 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.571 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.571 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[fe2523e6-27a3-44b4-93f9-4d4f7312992f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.573 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc157b052-21 in ovnmeta-c157b052-23af-40b0-bd07-76246d08673a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.575 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc157b052-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.575 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[28c55c1d-dfbc-4643-abc8-5272c23361d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.576 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f32c83d3-ed46-4b5d-b158-879271d2bfb1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 systemd-udevd[221639]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.592 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[866ee016-dea0-496f-829f-e52a2a9db772]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 NetworkManager[55548]: <info>  [1764347436.5970] device (tapc47cc86a-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:30:36 np0005538960 NetworkManager[55548]: <info>  [1764347436.5981] device (tapc47cc86a-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:30:36 np0005538960 systemd-machined[153518]: New machine qemu-15-instance-00000027.
Nov 28 11:30:36 np0005538960 systemd[1]: Started Virtual Machine qemu-15-instance-00000027.
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.618 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e65a776d-9b54-4a37-9064-2d57ccdd9801]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.652 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[39bf7961-2254-4266-aff4-c570e3646753]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 systemd-udevd[221643]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.660 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c50f8248-26e8-42c9-b714-ff7c6054bf6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 NetworkManager[55548]: <info>  [1764347436.6621] manager: (tapc157b052-20): new Veth device (/org/freedesktop/NetworkManager/Devices/103)
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.696 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[d964d28c-c20a-4a6f-bd9f-1e144bbab66a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.699 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[12107ae0-ac6b-4aa4-a645-e9c15f999e27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 NetworkManager[55548]: <info>  [1764347436.7287] device (tapc157b052-20): carrier: link connected
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.736 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[5431100f-09e9-42eb-ab73-7aec851e5cf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.755 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1d5e2b6e-c25a-4aaa-b0ed-5ab46f879a96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157b052-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:df:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453630, 'reachable_time': 37537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221672, 'error': None, 'target': 'ovnmeta-c157b052-23af-40b0-bd07-76246d08673a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.771 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f062e6ec-bd29-4a48-9dd4-1d51d7fb412a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:df9b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 453630, 'tstamp': 453630}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221673, 'error': None, 'target': 'ovnmeta-c157b052-23af-40b0-bd07-76246d08673a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.785 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e3280e93-79df-4aa0-8fc6-24ff5d62a1bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc157b052-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:df:9b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453630, 'reachable_time': 37537, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221674, 'error': None, 'target': 'ovnmeta-c157b052-23af-40b0-bd07-76246d08673a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.822 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e24bc1ae-6eee-4796-8e80-bbe902e352bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.887 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[38aa2baf-ed4e-457e-8d96-34789016f272]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.889 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157b052-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.890 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.890 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc157b052-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.892 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 kernel: tapc157b052-20: entered promiscuous mode
Nov 28 11:30:36 np0005538960 NetworkManager[55548]: <info>  [1764347436.8935] manager: (tapc157b052-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.895 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.896 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc157b052-20, col_values=(('external_ids', {'iface-id': '64711e8e-e597-4016-a4d5-c6f7d7c85c3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.897 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:36Z|00194|binding|INFO|Releasing lport 64711e8e-e597-4016-a4d5-c6f7d7c85c3a from this chassis (sb_readonly=0)
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.898 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.899 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c157b052-23af-40b0-bd07-76246d08673a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c157b052-23af-40b0-bd07-76246d08673a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.900 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3e92bc93-2e4a-460f-98d8-6c054d70c569]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.900 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-c157b052-23af-40b0-bd07-76246d08673a
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/c157b052-23af-40b0-bd07-76246d08673a.pid.haproxy
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID c157b052-23af-40b0-bd07-76246d08673a
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:30:36 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:36.902 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c157b052-23af-40b0-bd07-76246d08673a', 'env', 'PROCESS_TAG=haproxy-c157b052-23af-40b0-bd07-76246d08673a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c157b052-23af-40b0-bd07-76246d08673a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.909 187256 DEBUG nova.compute.manager [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.910 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.912 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Removed pending event for a1b30abb-1dff-48d7-ad1f-d8df62d0c976 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.912 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347436.9105778, a1b30abb-1dff-48d7-ad1f-d8df62d0c976 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.912 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.918 187256 INFO nova.virt.libvirt.driver [-] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Instance rebooted successfully.#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.918 187256 DEBUG nova.compute.manager [None req-e8880366-2481-4dcf-8e33-84c77b4f47cd 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.929 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.932 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.949 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.949 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347436.9112847, a1b30abb-1dff-48d7-ad1f-d8df62d0c976 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.949 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] VM Started (Lifecycle Event)#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.970 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:30:36 np0005538960 nova_compute[187252]: 2025-11-28 16:30:36.977 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Synchronizing instance power state after lifecycle event "Started"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:30:37 np0005538960 podman[221713]: 2025-11-28 16:30:37.301885142 +0000 UTC m=+0.068245787 container create ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:30:37 np0005538960 nova_compute[187252]: 2025-11-28 16:30:37.336 187256 DEBUG nova.compute.manager [req-31c55e3e-2ace-4c54-bdcd-12421866c879 req-697cec86-d15b-44fc-8de5-548b7f835d92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:30:37 np0005538960 nova_compute[187252]: 2025-11-28 16:30:37.337 187256 DEBUG oslo_concurrency.lockutils [req-31c55e3e-2ace-4c54-bdcd-12421866c879 req-697cec86-d15b-44fc-8de5-548b7f835d92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:37 np0005538960 nova_compute[187252]: 2025-11-28 16:30:37.337 187256 DEBUG oslo_concurrency.lockutils [req-31c55e3e-2ace-4c54-bdcd-12421866c879 req-697cec86-d15b-44fc-8de5-548b7f835d92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:37 np0005538960 nova_compute[187252]: 2025-11-28 16:30:37.338 187256 DEBUG oslo_concurrency.lockutils [req-31c55e3e-2ace-4c54-bdcd-12421866c879 req-697cec86-d15b-44fc-8de5-548b7f835d92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:37 np0005538960 nova_compute[187252]: 2025-11-28 16:30:37.338 187256 DEBUG nova.compute.manager [req-31c55e3e-2ace-4c54-bdcd-12421866c879 req-697cec86-d15b-44fc-8de5-548b7f835d92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] No waiting events found dispatching network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:30:37 np0005538960 nova_compute[187252]: 2025-11-28 16:30:37.338 187256 WARNING nova.compute.manager [req-31c55e3e-2ace-4c54-bdcd-12421866c879 req-697cec86-d15b-44fc-8de5-548b7f835d92 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received unexpected event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:30:37 np0005538960 systemd[1]: Started libpod-conmon-ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8.scope.
Nov 28 11:30:37 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:30:37 np0005538960 podman[221713]: 2025-11-28 16:30:37.274220066 +0000 UTC m=+0.040580731 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:30:37 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fbece4ffaa7a33aaafb9dd80bbd9deb754cec939974bbb58f0f96bd85f9e4fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:30:37 np0005538960 podman[221713]: 2025-11-28 16:30:37.386403094 +0000 UTC m=+0.152763749 container init ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 11:30:37 np0005538960 podman[221713]: 2025-11-28 16:30:37.392725619 +0000 UTC m=+0.159086264 container start ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 11:30:37 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221728]: [NOTICE]   (221732) : New worker (221734) forked
Nov 28 11:30:37 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221728]: [NOTICE]   (221732) : Loading success.
Nov 28 11:30:38 np0005538960 nova_compute[187252]: 2025-11-28 16:30:38.345 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:39 np0005538960 nova_compute[187252]: 2025-11-28 16:30:39.424 187256 DEBUG nova.compute.manager [req-370e4b30-a943-4d18-a326-bd1e1c1922b6 req-2f1d5494-23b4-4911-b4f7-7dd858e3d909 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:30:39 np0005538960 nova_compute[187252]: 2025-11-28 16:30:39.424 187256 DEBUG oslo_concurrency.lockutils [req-370e4b30-a943-4d18-a326-bd1e1c1922b6 req-2f1d5494-23b4-4911-b4f7-7dd858e3d909 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:39 np0005538960 nova_compute[187252]: 2025-11-28 16:30:39.425 187256 DEBUG oslo_concurrency.lockutils [req-370e4b30-a943-4d18-a326-bd1e1c1922b6 req-2f1d5494-23b4-4911-b4f7-7dd858e3d909 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:39 np0005538960 nova_compute[187252]: 2025-11-28 16:30:39.425 187256 DEBUG oslo_concurrency.lockutils [req-370e4b30-a943-4d18-a326-bd1e1c1922b6 req-2f1d5494-23b4-4911-b4f7-7dd858e3d909 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:39 np0005538960 nova_compute[187252]: 2025-11-28 16:30:39.425 187256 DEBUG nova.compute.manager [req-370e4b30-a943-4d18-a326-bd1e1c1922b6 req-2f1d5494-23b4-4911-b4f7-7dd858e3d909 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] No waiting events found dispatching network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:30:39 np0005538960 nova_compute[187252]: 2025-11-28 16:30:39.425 187256 WARNING nova.compute.manager [req-370e4b30-a943-4d18-a326-bd1e1c1922b6 req-2f1d5494-23b4-4911-b4f7-7dd858e3d909 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received unexpected event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:30:40 np0005538960 nova_compute[187252]: 2025-11-28 16:30:40.326 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:41 np0005538960 nova_compute[187252]: 2025-11-28 16:30:41.458 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:43 np0005538960 podman[221744]: 2025-11-28 16:30:43.224826372 +0000 UTC m=+0.130250849 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:30:43 np0005538960 nova_compute[187252]: 2025-11-28 16:30:43.348 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:45 np0005538960 nova_compute[187252]: 2025-11-28 16:30:45.761 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:46 np0005538960 podman[221770]: 2025-11-28 16:30:46.163497847 +0000 UTC m=+0.066853543 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 11:30:46 np0005538960 podman[221771]: 2025-11-28 16:30:46.169861352 +0000 UTC m=+0.069749043 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 11:30:46 np0005538960 nova_compute[187252]: 2025-11-28 16:30:46.460 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:48 np0005538960 nova_compute[187252]: 2025-11-28 16:30:48.350 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:48 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:48.961 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:18:b2:ed 2001:db8:0:1:f816:3eff:fe18:b2ed 2001:db8::f816:3eff:fe18:b2ed'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe18:b2ed/64 2001:db8::f816:3eff:fe18:b2ed/64', 'neutron:device_id': 'ovnmeta-f347c0ca-92f1-45be-a878-7d23544020cb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f347c0ca-92f1-45be-a878-7d23544020cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b4f4f29-c9cf-4537-9f34-0e57d8325dca, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a9646896-83c6-4a72-97f9-c6d052889af6) old=Port_Binding(mac=['fa:16:3e:18:b2:ed 2001:db8::f816:3eff:fe18:b2ed'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe18:b2ed/64', 'neutron:device_id': 'ovnmeta-f347c0ca-92f1-45be-a878-7d23544020cb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f347c0ca-92f1-45be-a878-7d23544020cb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:30:48 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:48.962 104369 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a9646896-83c6-4a72-97f9-c6d052889af6 in datapath f347c0ca-92f1-45be-a878-7d23544020cb updated#033[00m
Nov 28 11:30:48 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:48.964 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f347c0ca-92f1-45be-a878-7d23544020cb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:30:48 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:48.965 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4afa26-27e5-48e6-ac53-79437559f084]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:49 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:49Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a8:ff:58 10.100.0.11
Nov 28 11:30:51 np0005538960 podman[221820]: 2025-11-28 16:30:51.162281445 +0000 UTC m=+0.065928490 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:30:51 np0005538960 nova_compute[187252]: 2025-11-28 16:30:51.464 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:53 np0005538960 nova_compute[187252]: 2025-11-28 16:30:53.352 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:55 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:55Z|00195|binding|INFO|Releasing lport 64711e8e-e597-4016-a4d5-c6f7d7c85c3a from this chassis (sb_readonly=0)
Nov 28 11:30:55 np0005538960 nova_compute[187252]: 2025-11-28 16:30:55.099 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:55 np0005538960 nova_compute[187252]: 2025-11-28 16:30:55.179 187256 INFO nova.compute.manager [None req-43fa09b9-3346-46e0-9ce6-19201682533f 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Get console output#033[00m
Nov 28 11:30:55 np0005538960 nova_compute[187252]: 2025-11-28 16:30:55.186 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:30:55 np0005538960 podman[221845]: 2025-11-28 16:30:55.184886492 +0000 UTC m=+0.065023097 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6, config_id=edpm, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base 
Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 11:30:56 np0005538960 nova_compute[187252]: 2025-11-28 16:30:56.467 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:56.999 187256 DEBUG nova.compute.manager [req-cfe406b4-1dea-47c8-ad77-73860c9e0031 req-d795dcd1-d35f-4672-a329-c3a958ab8c0d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-changed-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.000 187256 DEBUG nova.compute.manager [req-cfe406b4-1dea-47c8-ad77-73860c9e0031 req-d795dcd1-d35f-4672-a329-c3a958ab8c0d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Refreshing instance network info cache due to event network-changed-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.000 187256 DEBUG oslo_concurrency.lockutils [req-cfe406b4-1dea-47c8-ad77-73860c9e0031 req-d795dcd1-d35f-4672-a329-c3a958ab8c0d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.000 187256 DEBUG oslo_concurrency.lockutils [req-cfe406b4-1dea-47c8-ad77-73860c9e0031 req-d795dcd1-d35f-4672-a329-c3a958ab8c0d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.001 187256 DEBUG nova.network.neutron [req-cfe406b4-1dea-47c8-ad77-73860c9e0031 req-d795dcd1-d35f-4672-a329-c3a958ab8c0d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Refreshing network info cache for port c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.449 187256 DEBUG oslo_concurrency.lockutils [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.450 187256 DEBUG oslo_concurrency.lockutils [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.450 187256 DEBUG oslo_concurrency.lockutils [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.451 187256 DEBUG oslo_concurrency.lockutils [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.451 187256 DEBUG oslo_concurrency.lockutils [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.452 187256 INFO nova.compute.manager [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Terminating instance#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.453 187256 DEBUG nova.compute.manager [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:30:57 np0005538960 kernel: tapc47cc86a-00 (unregistering): left promiscuous mode
Nov 28 11:30:57 np0005538960 NetworkManager[55548]: <info>  [1764347457.4796] device (tapc47cc86a-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.487 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:57 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:57Z|00196|binding|INFO|Releasing lport c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 from this chassis (sb_readonly=0)
Nov 28 11:30:57 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:57Z|00197|binding|INFO|Setting lport c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 down in Southbound
Nov 28 11:30:57 np0005538960 ovn_controller[95460]: 2025-11-28T16:30:57Z|00198|binding|INFO|Removing iface tapc47cc86a-00 ovn-installed in OVS
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.492 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.508 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.548 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:ff:58 10.100.0.11'], port_security=['fa:16:3e:a8:ff:58 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a1b30abb-1dff-48d7-ad1f-d8df62d0c976', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c157b052-23af-40b0-bd07-76246d08673a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5c027794-de2f-43a4-b849-302cec02015a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a298a2e7-adfa-4eff-8538-b5d3421dd91e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:30:57 np0005538960 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000027.scope: Deactivated successfully.
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.549 104369 INFO neutron.agent.ovn.metadata.agent [-] Port c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 in datapath c157b052-23af-40b0-bd07-76246d08673a unbound from our chassis#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.550 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c157b052-23af-40b0-bd07-76246d08673a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:30:57 np0005538960 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000027.scope: Consumed 14.004s CPU time.
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.551 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[84286d76-165e-4741-8a67-b54293a47cdb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.552 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c157b052-23af-40b0-bd07-76246d08673a namespace which is not needed anymore#033[00m
Nov 28 11:30:57 np0005538960 systemd-machined[153518]: Machine qemu-15-instance-00000027 terminated.
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.682 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.688 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:57 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221728]: [NOTICE]   (221732) : haproxy version is 2.8.14-c23fe91
Nov 28 11:30:57 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221728]: [NOTICE]   (221732) : path to executable is /usr/sbin/haproxy
Nov 28 11:30:57 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221728]: [WARNING]  (221732) : Exiting Master process...
Nov 28 11:30:57 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221728]: [WARNING]  (221732) : Exiting Master process...
Nov 28 11:30:57 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221728]: [ALERT]    (221732) : Current worker (221734) exited with code 143 (Terminated)
Nov 28 11:30:57 np0005538960 neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a[221728]: [WARNING]  (221732) : All workers exited. Exiting... (0)
Nov 28 11:30:57 np0005538960 systemd[1]: libpod-ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8.scope: Deactivated successfully.
Nov 28 11:30:57 np0005538960 podman[221891]: 2025-11-28 16:30:57.714692229 +0000 UTC m=+0.059108434 container died ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.723 187256 INFO nova.virt.libvirt.driver [-] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Instance destroyed successfully.#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.724 187256 DEBUG nova.objects.instance [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'resources' on Instance uuid a1b30abb-1dff-48d7-ad1f-d8df62d0c976 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:30:57 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8-userdata-shm.mount: Deactivated successfully.
Nov 28 11:30:57 np0005538960 systemd[1]: var-lib-containers-storage-overlay-8fbece4ffaa7a33aaafb9dd80bbd9deb754cec939974bbb58f0f96bd85f9e4fd-merged.mount: Deactivated successfully.
Nov 28 11:30:57 np0005538960 podman[221891]: 2025-11-28 16:30:57.759511122 +0000 UTC m=+0.103927327 container cleanup ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:30:57 np0005538960 systemd[1]: libpod-conmon-ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8.scope: Deactivated successfully.
Nov 28 11:30:57 np0005538960 podman[221935]: 2025-11-28 16:30:57.835853576 +0000 UTC m=+0.050512954 container remove ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.842 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1c90b387-c6ee-4a2a-97c0-de94f105ea38]: (4, ('Fri Nov 28 04:30:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a (ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8)\nad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8\nFri Nov 28 04:30:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c157b052-23af-40b0-bd07-76246d08673a (ad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8)\nad9c6553e767dd324ad579cfa1e8e54e6a51d190f235f4d6629764dcde50ddf8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.844 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[4f50a3d5-fbe2-4aca-a512-ec410cb75fb9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.846 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc157b052-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.848 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:57 np0005538960 kernel: tapc157b052-20: left promiscuous mode
Nov 28 11:30:57 np0005538960 nova_compute[187252]: 2025-11-28 16:30:57.865 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.868 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[4849ee8a-f645-4de0-9380-aa8090e2f801]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.888 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[cedb24ac-a831-44e5-91bb-782e046fdbd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.891 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc24489-45f5-4378-ba65-9ce2339e7f51]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.907 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[15c9d5e1-9597-401a-ab3d-6cabc08e3257]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 453622, 'reachable_time': 16668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221954, 'error': None, 'target': 'ovnmeta-c157b052-23af-40b0-bd07-76246d08673a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.909 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c157b052-23af-40b0-bd07-76246d08673a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:30:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:30:57.910 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[053a5ef2-10d5-427c-861b-a06eb40ce978]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:30:57 np0005538960 systemd[1]: run-netns-ovnmeta\x2dc157b052\x2d23af\x2d40b0\x2dbd07\x2d76246d08673a.mount: Deactivated successfully.
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.180 187256 DEBUG nova.virt.libvirt.vif [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:29:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1382753008',display_name='tempest-TestNetworkAdvancedServerOps-server-1382753008',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1382753008',id=39,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJHY6QOEV1JMKbUARcz9utL5DtOLnFnwM2R5sR2I1IkKsD/dQqWZDFD5Y/W9gE6GTadzYOlKYgSEd+LYZ20Nqa9RSL/Fm2PBKcVQxmT4Cy87yXdNA/8990HGivXatNk2Rw==',key_name='tempest-TestNetworkAdvancedServerOps-1858247008',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:30:07Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-284npo52',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:30:36Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=a1b30abb-1dff-48d7-ad1f-d8df62d0c976,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.181 187256 DEBUG nova.network.os_vif_util [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.181 187256 DEBUG nova.network.os_vif_util [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.182 187256 DEBUG os_vif [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.184 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.185 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc47cc86a-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.186 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.188 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.190 187256 INFO os_vif [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a8:ff:58,bridge_name='br-int',has_traffic_filtering=True,id=c47cc86a-00e6-4adc-a8ce-1ceeb7b72534,network=Network(c157b052-23af-40b0-bd07-76246d08673a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc47cc86a-00')#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.190 187256 INFO nova.virt.libvirt.driver [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Deleting instance files /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976_del#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.191 187256 INFO nova.virt.libvirt.driver [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Deletion of /var/lib/nova/instances/a1b30abb-1dff-48d7-ad1f-d8df62d0c976_del complete#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.354 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.701 187256 INFO nova.compute.manager [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Took 1.25 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.702 187256 DEBUG oslo.service.loopingcall [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.702 187256 DEBUG nova.compute.manager [-] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:30:58 np0005538960 nova_compute[187252]: 2025-11-28 16:30:58.703 187256 DEBUG nova.network.neutron [-] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:30:59 np0005538960 nova_compute[187252]: 2025-11-28 16:30:59.234 187256 DEBUG nova.compute.manager [req-8de67c65-2510-43f9-9ee1-c4a8af95e040 req-a9de2fc8-d9f4-4872-9a16-648ef1931ca0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-vif-unplugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:30:59 np0005538960 nova_compute[187252]: 2025-11-28 16:30:59.234 187256 DEBUG oslo_concurrency.lockutils [req-8de67c65-2510-43f9-9ee1-c4a8af95e040 req-a9de2fc8-d9f4-4872-9a16-648ef1931ca0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:30:59 np0005538960 nova_compute[187252]: 2025-11-28 16:30:59.234 187256 DEBUG oslo_concurrency.lockutils [req-8de67c65-2510-43f9-9ee1-c4a8af95e040 req-a9de2fc8-d9f4-4872-9a16-648ef1931ca0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:30:59 np0005538960 nova_compute[187252]: 2025-11-28 16:30:59.235 187256 DEBUG oslo_concurrency.lockutils [req-8de67c65-2510-43f9-9ee1-c4a8af95e040 req-a9de2fc8-d9f4-4872-9a16-648ef1931ca0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:30:59 np0005538960 nova_compute[187252]: 2025-11-28 16:30:59.235 187256 DEBUG nova.compute.manager [req-8de67c65-2510-43f9-9ee1-c4a8af95e040 req-a9de2fc8-d9f4-4872-9a16-648ef1931ca0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] No waiting events found dispatching network-vif-unplugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:30:59 np0005538960 nova_compute[187252]: 2025-11-28 16:30:59.235 187256 DEBUG nova.compute.manager [req-8de67c65-2510-43f9-9ee1-c4a8af95e040 req-a9de2fc8-d9f4-4872-9a16-648ef1931ca0 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-vif-unplugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.087 187256 DEBUG nova.network.neutron [-] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.140 187256 INFO nova.compute.manager [-] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Took 1.44 seconds to deallocate network for instance.#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.174 187256 DEBUG nova.compute.manager [req-ed25777e-040a-4793-9268-007acde7f331 req-674161c2-c91a-4a69-9da2-2b0cc8e14d2a 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-vif-deleted-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.193 187256 DEBUG nova.network.neutron [req-cfe406b4-1dea-47c8-ad77-73860c9e0031 req-d795dcd1-d35f-4672-a329-c3a958ab8c0d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updated VIF entry in instance network info cache for port c47cc86a-00e6-4adc-a8ce-1ceeb7b72534. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.194 187256 DEBUG nova.network.neutron [req-cfe406b4-1dea-47c8-ad77-73860c9e0031 req-d795dcd1-d35f-4672-a329-c3a958ab8c0d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Updating instance_info_cache with network_info: [{"id": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "address": "fa:16:3e:a8:ff:58", "network": {"id": "c157b052-23af-40b0-bd07-76246d08673a", "bridge": "br-int", "label": "tempest-network-smoke--222866476", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc47cc86a-00", "ovs_interfaceid": "c47cc86a-00e6-4adc-a8ce-1ceeb7b72534", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.251 187256 DEBUG oslo_concurrency.lockutils [req-cfe406b4-1dea-47c8-ad77-73860c9e0031 req-d795dcd1-d35f-4672-a329-c3a958ab8c0d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-a1b30abb-1dff-48d7-ad1f-d8df62d0c976" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.279 187256 DEBUG oslo_concurrency.lockutils [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.280 187256 DEBUG oslo_concurrency.lockutils [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.381 187256 DEBUG nova.compute.provider_tree [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.396 187256 DEBUG nova.scheduler.client.report [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.420 187256 DEBUG oslo_concurrency.lockutils [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.475 187256 INFO nova.scheduler.client.report [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Deleted allocations for instance a1b30abb-1dff-48d7-ad1f-d8df62d0c976#033[00m
Nov 28 11:31:00 np0005538960 nova_compute[187252]: 2025-11-28 16:31:00.555 187256 DEBUG oslo_concurrency.lockutils [None req-2386440a-6ca0-4f7c-8aae-2a4ba53d6755 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.105s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:01 np0005538960 nova_compute[187252]: 2025-11-28 16:31:01.385 187256 DEBUG nova.compute.manager [req-ab670bdb-8adf-4e5b-8eec-ad5b345d86cf req-e0d08764-8084-455c-bbc7-132f2c35e85d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:31:01 np0005538960 nova_compute[187252]: 2025-11-28 16:31:01.386 187256 DEBUG oslo_concurrency.lockutils [req-ab670bdb-8adf-4e5b-8eec-ad5b345d86cf req-e0d08764-8084-455c-bbc7-132f2c35e85d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:01 np0005538960 nova_compute[187252]: 2025-11-28 16:31:01.387 187256 DEBUG oslo_concurrency.lockutils [req-ab670bdb-8adf-4e5b-8eec-ad5b345d86cf req-e0d08764-8084-455c-bbc7-132f2c35e85d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:01 np0005538960 nova_compute[187252]: 2025-11-28 16:31:01.387 187256 DEBUG oslo_concurrency.lockutils [req-ab670bdb-8adf-4e5b-8eec-ad5b345d86cf req-e0d08764-8084-455c-bbc7-132f2c35e85d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "a1b30abb-1dff-48d7-ad1f-d8df62d0c976-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:01 np0005538960 nova_compute[187252]: 2025-11-28 16:31:01.387 187256 DEBUG nova.compute.manager [req-ab670bdb-8adf-4e5b-8eec-ad5b345d86cf req-e0d08764-8084-455c-bbc7-132f2c35e85d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] No waiting events found dispatching network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:31:01 np0005538960 nova_compute[187252]: 2025-11-28 16:31:01.388 187256 WARNING nova.compute.manager [req-ab670bdb-8adf-4e5b-8eec-ad5b345d86cf req-e0d08764-8084-455c-bbc7-132f2c35e85d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Received unexpected event network-vif-plugged-c47cc86a-00e6-4adc-a8ce-1ceeb7b72534 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 11:31:01 np0005538960 nova_compute[187252]: 2025-11-28 16:31:01.524 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:03 np0005538960 nova_compute[187252]: 2025-11-28 16:31:03.187 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:03 np0005538960 nova_compute[187252]: 2025-11-28 16:31:03.358 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:04 np0005538960 podman[221957]: 2025-11-28 16:31:04.180045767 +0000 UTC m=+0.075305369 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 11:31:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:06.350 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:06.351 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:06.351 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:06 np0005538960 nova_compute[187252]: 2025-11-28 16:31:06.570 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:06 np0005538960 nova_compute[187252]: 2025-11-28 16:31:06.749 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:07 np0005538960 podman[221978]: 2025-11-28 16:31:07.185017358 +0000 UTC m=+0.093778010 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:31:08 np0005538960 nova_compute[187252]: 2025-11-28 16:31:08.189 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:08 np0005538960 nova_compute[187252]: 2025-11-28 16:31:08.359 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:12 np0005538960 nova_compute[187252]: 2025-11-28 16:31:12.722 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347457.7203343, a1b30abb-1dff-48d7-ad1f-d8df62d0c976 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:31:12 np0005538960 nova_compute[187252]: 2025-11-28 16:31:12.723 187256 INFO nova.compute.manager [-] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:31:12 np0005538960 nova_compute[187252]: 2025-11-28 16:31:12.747 187256 DEBUG nova.compute.manager [None req-6c545d8a-0cd4-463f-bbe0-fe1e1fe0738d - - - - - -] [instance: a1b30abb-1dff-48d7-ad1f-d8df62d0c976] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:31:13 np0005538960 nova_compute[187252]: 2025-11-28 16:31:13.190 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:13 np0005538960 nova_compute[187252]: 2025-11-28 16:31:13.361 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:13.371 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:31:13 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:13.372 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:31:13 np0005538960 nova_compute[187252]: 2025-11-28 16:31:13.372 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:14 np0005538960 podman[222004]: 2025-11-28 16:31:14.216381049 +0000 UTC m=+0.118352650 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 11:31:17 np0005538960 podman[222031]: 2025-11-28 16:31:17.153963596 +0000 UTC m=+0.055720421 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 11:31:17 np0005538960 podman[222030]: 2025-11-28 16:31:17.159623154 +0000 UTC m=+0.065037568 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 11:31:18 np0005538960 nova_compute[187252]: 2025-11-28 16:31:18.192 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:18 np0005538960 nova_compute[187252]: 2025-11-28 16:31:18.363 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:20 np0005538960 nova_compute[187252]: 2025-11-28 16:31:20.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:22 np0005538960 podman[222068]: 2025-11-28 16:31:22.158409613 +0000 UTC m=+0.062209949 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:31:22 np0005538960 nova_compute[187252]: 2025-11-28 16:31:22.331 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:22 np0005538960 nova_compute[187252]: 2025-11-28 16:31:22.331 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:31:22 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:22.374 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:31:23 np0005538960 nova_compute[187252]: 2025-11-28 16:31:23.233 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:23 np0005538960 nova_compute[187252]: 2025-11-28 16:31:23.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:23 np0005538960 nova_compute[187252]: 2025-11-28 16:31:23.365 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:24 np0005538960 nova_compute[187252]: 2025-11-28 16:31:24.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:26 np0005538960 podman[222092]: 2025-11-28 16:31:26.146148969 +0000 UTC m=+0.057727173 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=minimal rhel9)
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.341 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.342 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.366 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.367 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.367 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.367 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.563 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.565 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5757MB free_disk=73.33789825439453GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.566 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.566 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.676 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.677 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.727 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.748 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.788 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:31:26 np0005538960 nova_compute[187252]: 2025-11-28 16:31:26.788 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.222s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:27 np0005538960 nova_compute[187252]: 2025-11-28 16:31:27.762 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:27 np0005538960 nova_compute[187252]: 2025-11-28 16:31:27.763 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:27 np0005538960 nova_compute[187252]: 2025-11-28 16:31:27.838 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:27 np0005538960 nova_compute[187252]: 2025-11-28 16:31:27.838 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:27 np0005538960 nova_compute[187252]: 2025-11-28 16:31:27.859 187256 DEBUG nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:31:27 np0005538960 nova_compute[187252]: 2025-11-28 16:31:27.970 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:27 np0005538960 nova_compute[187252]: 2025-11-28 16:31:27.971 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:27 np0005538960 nova_compute[187252]: 2025-11-28 16:31:27.978 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:31:27 np0005538960 nova_compute[187252]: 2025-11-28 16:31:27.978 187256 INFO nova.compute.claims [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.091 187256 DEBUG nova.compute.provider_tree [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.109 187256 DEBUG nova.scheduler.client.report [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.135 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.164s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.137 187256 DEBUG nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.187 187256 DEBUG nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.187 187256 DEBUG nova.network.neutron [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.210 187256 INFO nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.234 187256 DEBUG nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.237 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.310 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.347 187256 DEBUG nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.348 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.349 187256 INFO nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Creating image(s)#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.350 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "/var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.350 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "/var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.351 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "/var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.365 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.383 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.428 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.429 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.430 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.444 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.490 187256 DEBUG nova.policy [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5d381eba17324dd5ad798648b82d0115', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7e408bace48b41a1ac0677d300b6d288', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.512 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.513 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.554 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.556 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.556 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.617 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.618 187256 DEBUG nova.virt.disk.api [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Checking if we can resize image /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.619 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.680 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.682 187256 DEBUG nova.virt.disk.api [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Cannot resize image /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.682 187256 DEBUG nova.objects.instance [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'migration_context' on Instance uuid 88ea9985-1aae-41bc-b36b-f2cfcc70a818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.745 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.746 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Ensure instance console log exists: /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.747 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.747 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:28 np0005538960 nova_compute[187252]: 2025-11-28 16:31:28.747 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:29 np0005538960 nova_compute[187252]: 2025-11-28 16:31:29.680 187256 DEBUG nova.network.neutron [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Successfully created port: ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:31:30 np0005538960 nova_compute[187252]: 2025-11-28 16:31:30.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:30 np0005538960 nova_compute[187252]: 2025-11-28 16:31:30.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:30 np0005538960 nova_compute[187252]: 2025-11-28 16:31:30.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 11:31:30 np0005538960 nova_compute[187252]: 2025-11-28 16:31:30.338 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 11:31:30 np0005538960 nova_compute[187252]: 2025-11-28 16:31:30.339 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:31:30 np0005538960 nova_compute[187252]: 2025-11-28 16:31:30.339 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 11:31:31 np0005538960 nova_compute[187252]: 2025-11-28 16:31:31.053 187256 DEBUG nova.network.neutron [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Successfully updated port: ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:31:31 np0005538960 nova_compute[187252]: 2025-11-28 16:31:31.073 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:31:31 np0005538960 nova_compute[187252]: 2025-11-28 16:31:31.074 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquired lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:31:31 np0005538960 nova_compute[187252]: 2025-11-28 16:31:31.074 187256 DEBUG nova.network.neutron [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:31:31 np0005538960 nova_compute[187252]: 2025-11-28 16:31:31.176 187256 DEBUG nova.compute.manager [req-ce6aa205-3f3c-40ad-9234-8a3565fa57d0 req-2f4cc8eb-758c-484f-874e-6f06ae3c2a11 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-changed-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:31:31 np0005538960 nova_compute[187252]: 2025-11-28 16:31:31.176 187256 DEBUG nova.compute.manager [req-ce6aa205-3f3c-40ad-9234-8a3565fa57d0 req-2f4cc8eb-758c-484f-874e-6f06ae3c2a11 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Refreshing instance network info cache due to event network-changed-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:31:31 np0005538960 nova_compute[187252]: 2025-11-28 16:31:31.177 187256 DEBUG oslo_concurrency.lockutils [req-ce6aa205-3f3c-40ad-9234-8a3565fa57d0 req-2f4cc8eb-758c-484f-874e-6f06ae3c2a11 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:31:31 np0005538960 nova_compute[187252]: 2025-11-28 16:31:31.355 187256 DEBUG nova.network.neutron [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.463 187256 DEBUG nova.network.neutron [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Updating instance_info_cache with network_info: [{"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.645 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Releasing lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.646 187256 DEBUG nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Instance network_info: |[{"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.647 187256 DEBUG oslo_concurrency.lockutils [req-ce6aa205-3f3c-40ad-9234-8a3565fa57d0 req-2f4cc8eb-758c-484f-874e-6f06ae3c2a11 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.648 187256 DEBUG nova.network.neutron [req-ce6aa205-3f3c-40ad-9234-8a3565fa57d0 req-2f4cc8eb-758c-484f-874e-6f06ae3c2a11 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Refreshing network info cache for port ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.651 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Start _get_guest_xml network_info=[{"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.656 187256 WARNING nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.665 187256 DEBUG nova.virt.libvirt.host [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.667 187256 DEBUG nova.virt.libvirt.host [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.672 187256 DEBUG nova.virt.libvirt.host [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.674 187256 DEBUG nova.virt.libvirt.host [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.675 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.675 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.676 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.676 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.676 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.676 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.677 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.677 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.677 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.677 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.678 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.678 187256 DEBUG nova.virt.hardware [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.684 187256 DEBUG nova.virt.libvirt.vif [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:31:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1949144222',display_name='tempest-TestNetworkAdvancedServerOps-server-1949144222',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1949144222',id=43,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNX0V1ZWqLkvjqknjHue7eMDjmU57Kyi77YE+oInmP6qNV/QL9X1483/QWccYbgRcBJxP6wOD3EZHA4fgPCGYTsYgxlD9MQiFEHV3r0YinxX/QpdEknYPmrXIJ/dQd9U0Q==',key_name='tempest-TestNetworkAdvancedServerOps-1862960466',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-kv5ts5xj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:31:28Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=88ea9985-1aae-41bc-b36b-f2cfcc70a818,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.684 187256 DEBUG nova.network.os_vif_util [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.685 187256 DEBUG nova.network.os_vif_util [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:b8:6a,bridge_name='br-int',has_traffic_filtering=True,id=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9,network=Network(eb8f40f6-3754-4f6d-a4be-6f9f22f6f691),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca1c4039-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.686 187256 DEBUG nova.objects.instance [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'pci_devices' on Instance uuid 88ea9985-1aae-41bc-b36b-f2cfcc70a818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.700 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <uuid>88ea9985-1aae-41bc-b36b-f2cfcc70a818</uuid>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <name>instance-0000002b</name>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1949144222</nova:name>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:31:32</nova:creationTime>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:        <nova:user uuid="5d381eba17324dd5ad798648b82d0115">tempest-TestNetworkAdvancedServerOps-762685809-project-member</nova:user>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:        <nova:project uuid="7e408bace48b41a1ac0677d300b6d288">tempest-TestNetworkAdvancedServerOps-762685809</nova:project>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:        <nova:port uuid="ca1c4039-2c03-41ff-ab95-67b86f6e0ee9">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <entry name="serial">88ea9985-1aae-41bc-b36b-f2cfcc70a818</entry>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <entry name="uuid">88ea9985-1aae-41bc-b36b-f2cfcc70a818</entry>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk.config"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:98:b8:6a"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <target dev="tapca1c4039-2c"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/console.log" append="off"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:31:32 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:31:32 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:31:32 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:31:32 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.702 187256 DEBUG nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Preparing to wait for external event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.702 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.702 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.702 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.703 187256 DEBUG nova.virt.libvirt.vif [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:31:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1949144222',display_name='tempest-TestNetworkAdvancedServerOps-server-1949144222',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1949144222',id=43,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNX0V1ZWqLkvjqknjHue7eMDjmU57Kyi77YE+oInmP6qNV/QL9X1483/QWccYbgRcBJxP6wOD3EZHA4fgPCGYTsYgxlD9MQiFEHV3r0YinxX/QpdEknYPmrXIJ/dQd9U0Q==',key_name='tempest-TestNetworkAdvancedServerOps-1862960466',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-kv5ts5xj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:31:28Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=88ea9985-1aae-41bc-b36b-f2cfcc70a818,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.703 187256 DEBUG nova.network.os_vif_util [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.704 187256 DEBUG nova.network.os_vif_util [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:b8:6a,bridge_name='br-int',has_traffic_filtering=True,id=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9,network=Network(eb8f40f6-3754-4f6d-a4be-6f9f22f6f691),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca1c4039-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.704 187256 DEBUG os_vif [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:b8:6a,bridge_name='br-int',has_traffic_filtering=True,id=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9,network=Network(eb8f40f6-3754-4f6d-a4be-6f9f22f6f691),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca1c4039-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.705 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.705 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.706 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.710 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.710 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca1c4039-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.710 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapca1c4039-2c, col_values=(('external_ids', {'iface-id': 'ca1c4039-2c03-41ff-ab95-67b86f6e0ee9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:b8:6a', 'vm-uuid': '88ea9985-1aae-41bc-b36b-f2cfcc70a818'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:31:32 np0005538960 NetworkManager[55548]: <info>  [1764347492.7137] manager: (tapca1c4039-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.712 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.716 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.721 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.722 187256 INFO os_vif [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:b8:6a,bridge_name='br-int',has_traffic_filtering=True,id=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9,network=Network(eb8f40f6-3754-4f6d-a4be-6f9f22f6f691),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca1c4039-2c')#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.778 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.779 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.779 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] No VIF found with MAC fa:16:3e:98:b8:6a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:31:32 np0005538960 nova_compute[187252]: 2025-11-28 16:31:32.780 187256 INFO nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Using config drive#033[00m
Nov 28 11:31:33 np0005538960 nova_compute[187252]: 2025-11-28 16:31:33.369 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:33 np0005538960 nova_compute[187252]: 2025-11-28 16:31:33.428 187256 INFO nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Creating config drive at /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk.config#033[00m
Nov 28 11:31:33 np0005538960 nova_compute[187252]: 2025-11-28 16:31:33.433 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvqj0xoy9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:31:33 np0005538960 nova_compute[187252]: 2025-11-28 16:31:33.561 187256 DEBUG oslo_concurrency.processutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvqj0xoy9" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:31:33 np0005538960 kernel: tapca1c4039-2c: entered promiscuous mode
Nov 28 11:31:33 np0005538960 NetworkManager[55548]: <info>  [1764347493.6319] manager: (tapca1c4039-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Nov 28 11:31:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:33Z|00199|binding|INFO|Claiming lport ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 for this chassis.
Nov 28 11:31:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:33Z|00200|binding|INFO|ca1c4039-2c03-41ff-ab95-67b86f6e0ee9: Claiming fa:16:3e:98:b8:6a 10.100.0.14
Nov 28 11:31:33 np0005538960 nova_compute[187252]: 2025-11-28 16:31:33.635 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:33 np0005538960 systemd-udevd[222150]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:31:33 np0005538960 systemd-machined[153518]: New machine qemu-16-instance-0000002b.
Nov 28 11:31:33 np0005538960 NetworkManager[55548]: <info>  [1764347493.6811] device (tapca1c4039-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:31:33 np0005538960 NetworkManager[55548]: <info>  [1764347493.6821] device (tapca1c4039-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.685 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:b8:6a 10.100.0.14'], port_security=['fa:16:3e:98:b8:6a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '88ea9985-1aae-41bc-b36b-f2cfcc70a818', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3408f247-1ba1-4e41-821e-cda0531bb57d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a01dcc9-6649-4511-8cfb-c117ff260318, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.688 104369 INFO neutron.agent.ovn.metadata.agent [-] Port ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 in datapath eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 bound to our chassis#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.689 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eb8f40f6-3754-4f6d-a4be-6f9f22f6f691#033[00m
Nov 28 11:31:33 np0005538960 nova_compute[187252]: 2025-11-28 16:31:33.689 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:33Z|00201|binding|INFO|Setting lport ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 ovn-installed in OVS
Nov 28 11:31:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:33Z|00202|binding|INFO|Setting lport ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 up in Southbound
Nov 28 11:31:33 np0005538960 nova_compute[187252]: 2025-11-28 16:31:33.697 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:33 np0005538960 systemd[1]: Started Virtual Machine qemu-16-instance-0000002b.
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.701 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6e71f02b-6380-4ccb-9fe8-09672fe943f5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.703 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeb8f40f6-31 in ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.706 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeb8f40f6-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.706 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5e7b837a-98ad-49e1-9454-40eb0933359e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.708 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[39e91460-f540-43fa-a0f6-7f9e0c730510]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.722 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[a3fd5a59-e882-4874-88d7-3b60796a2d9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.752 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[72106ccf-aac6-4f8c-b517-41cc15e3307a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.787 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[1884bd2e-c6d7-44d6-bb85-2f077136b265]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 NetworkManager[55548]: <info>  [1764347493.7965] manager: (tapeb8f40f6-30): new Veth device (/org/freedesktop/NetworkManager/Devices/107)
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.796 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e6805294-6f45-4dca-a9ac-9c321aa7569a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.842 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[d65e836c-84c4-456e-bc9f-0b9fd6129b6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.846 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a4baf1-e4da-489e-8da0-3647ffd7aa2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 NetworkManager[55548]: <info>  [1764347493.8733] device (tapeb8f40f6-30): carrier: link connected
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.882 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[e3595cf6-8bda-40d4-9317-980c0accda6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.902 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[fbd2b373-8570-43a6-a400-27df1ce6a489]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb8f40f6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:08:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459345, 'reachable_time': 39628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222183, 'error': None, 'target': 'ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.926 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1f3fae77-eaae-460c-b8b9-f821822d714c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:8d0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 459345, 'tstamp': 459345}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222186, 'error': None, 'target': 'ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.944 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c900c9e8-42af-41e8-9f2d-b56bb0c188dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb8f40f6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:08:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 61], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459345, 'reachable_time': 39628, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222191, 'error': None, 'target': 'ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:33.973 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[0a651ada-14ff-4809-89e6-09c8d76ed0c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.019 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347494.0173922, 88ea9985-1aae-41bc-b36b-f2cfcc70a818 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.019 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] VM Started (Lifecycle Event)#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.036 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:34.039 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[73dfc0c1-372c-4cc0-9313-95aeacb981d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:34.040 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb8f40f6-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:34.041 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.041 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347494.0176876, 88ea9985-1aae-41bc-b36b-f2cfcc70a818 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:34.041 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb8f40f6-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.041 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.043 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:34 np0005538960 NetworkManager[55548]: <info>  [1764347494.0443] manager: (tapeb8f40f6-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Nov 28 11:31:34 np0005538960 kernel: tapeb8f40f6-30: entered promiscuous mode
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.046 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:34.047 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeb8f40f6-30, col_values=(('external_ids', {'iface-id': 'e688a542-eaf2-403a-92da-c73d2f7f4a79'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:31:34 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:34Z|00203|binding|INFO|Releasing lport e688a542-eaf2-403a-92da-c73d2f7f4a79 from this chassis (sb_readonly=0)
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:34.049 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb8f40f6-3754-4f6d-a4be-6f9f22f6f691.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb8f40f6-3754-4f6d-a4be-6f9f22f6f691.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:34.050 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d17c51db-6d56-4db0-8ad8-c15d219bf54f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:34.051 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/eb8f40f6-3754-4f6d-a4be-6f9f22f6f691.pid.haproxy
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID eb8f40f6-3754-4f6d-a4be-6f9f22f6f691
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:31:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:34.052 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'env', 'PROCESS_TAG=haproxy-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eb8f40f6-3754-4f6d-a4be-6f9f22f6f691.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.055 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.060 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.061 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.076 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.143 187256 DEBUG nova.network.neutron [req-ce6aa205-3f3c-40ad-9234-8a3565fa57d0 req-2f4cc8eb-758c-484f-874e-6f06ae3c2a11 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Updated VIF entry in instance network info cache for port ca1c4039-2c03-41ff-ab95-67b86f6e0ee9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.144 187256 DEBUG nova.network.neutron [req-ce6aa205-3f3c-40ad-9234-8a3565fa57d0 req-2f4cc8eb-758c-484f-874e-6f06ae3c2a11 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Updating instance_info_cache with network_info: [{"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.157 187256 DEBUG oslo_concurrency.lockutils [req-ce6aa205-3f3c-40ad-9234-8a3565fa57d0 req-2f4cc8eb-758c-484f-874e-6f06ae3c2a11 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.208 187256 DEBUG nova.compute.manager [req-03cf1e38-457e-4d16-9727-f2c3dddf3f16 req-63544f81-1ba1-41f2-9984-0ba1b99c7ce6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.209 187256 DEBUG oslo_concurrency.lockutils [req-03cf1e38-457e-4d16-9727-f2c3dddf3f16 req-63544f81-1ba1-41f2-9984-0ba1b99c7ce6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.210 187256 DEBUG oslo_concurrency.lockutils [req-03cf1e38-457e-4d16-9727-f2c3dddf3f16 req-63544f81-1ba1-41f2-9984-0ba1b99c7ce6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.210 187256 DEBUG oslo_concurrency.lockutils [req-03cf1e38-457e-4d16-9727-f2c3dddf3f16 req-63544f81-1ba1-41f2-9984-0ba1b99c7ce6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.211 187256 DEBUG nova.compute.manager [req-03cf1e38-457e-4d16-9727-f2c3dddf3f16 req-63544f81-1ba1-41f2-9984-0ba1b99c7ce6 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Processing event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.212 187256 DEBUG nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.215 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347494.2156885, 88ea9985-1aae-41bc-b36b-f2cfcc70a818 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.216 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.218 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.222 187256 INFO nova.virt.libvirt.driver [-] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Instance spawned successfully.#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.223 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.240 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.247 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.250 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.251 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.251 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.252 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.252 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.253 187256 DEBUG nova.virt.libvirt.driver [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.272 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.330 187256 INFO nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Took 5.98 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.331 187256 DEBUG nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:31:34 np0005538960 podman[222224]: 2025-11-28 16:31:34.47495726 +0000 UTC m=+0.062674821 container create ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 11:31:34 np0005538960 systemd[1]: Started libpod-conmon-ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb.scope.
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.523 187256 INFO nova.compute.manager [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Took 6.61 seconds to build instance.#033[00m
Nov 28 11:31:34 np0005538960 podman[222224]: 2025-11-28 16:31:34.435712845 +0000 UTC m=+0.023430436 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:31:34 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:31:34 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/059510e768e944ce6e50d23b28ba86f6ebc53d2ccb1cfb7d02a924ba24dde531/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:31:34 np0005538960 podman[222224]: 2025-11-28 16:31:34.561070075 +0000 UTC m=+0.148787656 container init ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:31:34 np0005538960 podman[222224]: 2025-11-28 16:31:34.569069878 +0000 UTC m=+0.156787439 container start ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 11:31:34 np0005538960 podman[222237]: 2025-11-28 16:31:34.594247465 +0000 UTC m=+0.081066565 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Nov 28 11:31:34 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222245]: [NOTICE]   (222262) : New worker (222265) forked
Nov 28 11:31:34 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222245]: [NOTICE]   (222262) : Loading success.
Nov 28 11:31:34 np0005538960 nova_compute[187252]: 2025-11-28 16:31:34.634 187256 DEBUG oslo_concurrency.lockutils [None req-ae355a5b-384b-4e52-9ebb-02d96b86fba7 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:36 np0005538960 nova_compute[187252]: 2025-11-28 16:31:36.383 187256 DEBUG nova.compute.manager [req-7e50cff9-dc80-4738-ab87-489b0dc587f2 req-0ec82821-b9e4-480e-a6b0-fb3f0b1cab84 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:31:36 np0005538960 nova_compute[187252]: 2025-11-28 16:31:36.384 187256 DEBUG oslo_concurrency.lockutils [req-7e50cff9-dc80-4738-ab87-489b0dc587f2 req-0ec82821-b9e4-480e-a6b0-fb3f0b1cab84 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:36 np0005538960 nova_compute[187252]: 2025-11-28 16:31:36.384 187256 DEBUG oslo_concurrency.lockutils [req-7e50cff9-dc80-4738-ab87-489b0dc587f2 req-0ec82821-b9e4-480e-a6b0-fb3f0b1cab84 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:36 np0005538960 nova_compute[187252]: 2025-11-28 16:31:36.384 187256 DEBUG oslo_concurrency.lockutils [req-7e50cff9-dc80-4738-ab87-489b0dc587f2 req-0ec82821-b9e4-480e-a6b0-fb3f0b1cab84 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:36 np0005538960 nova_compute[187252]: 2025-11-28 16:31:36.385 187256 DEBUG nova.compute.manager [req-7e50cff9-dc80-4738-ab87-489b0dc587f2 req-0ec82821-b9e4-480e-a6b0-fb3f0b1cab84 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] No waiting events found dispatching network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:31:36 np0005538960 nova_compute[187252]: 2025-11-28 16:31:36.385 187256 WARNING nova.compute.manager [req-7e50cff9-dc80-4738-ab87-489b0dc587f2 req-0ec82821-b9e4-480e-a6b0-fb3f0b1cab84 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received unexpected event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:31:37 np0005538960 nova_compute[187252]: 2025-11-28 16:31:37.713 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:38 np0005538960 podman[222276]: 2025-11-28 16:31:38.152928227 +0000 UTC m=+0.057251231 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:31:38 np0005538960 nova_compute[187252]: 2025-11-28 16:31:38.373 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:39 np0005538960 NetworkManager[55548]: <info>  [1764347499.8628] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Nov 28 11:31:39 np0005538960 NetworkManager[55548]: <info>  [1764347499.8638] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Nov 28 11:31:39 np0005538960 nova_compute[187252]: 2025-11-28 16:31:39.862 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:39 np0005538960 nova_compute[187252]: 2025-11-28 16:31:39.987 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:39 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:39Z|00204|binding|INFO|Releasing lport e688a542-eaf2-403a-92da-c73d2f7f4a79 from this chassis (sb_readonly=0)
Nov 28 11:31:40 np0005538960 nova_compute[187252]: 2025-11-28 16:31:40.013 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:42 np0005538960 nova_compute[187252]: 2025-11-28 16:31:42.717 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:43 np0005538960 nova_compute[187252]: 2025-11-28 16:31:43.375 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:45 np0005538960 podman[222299]: 2025-11-28 16:31:45.193207834 +0000 UTC m=+0.095312708 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:31:45 np0005538960 nova_compute[187252]: 2025-11-28 16:31:45.900 187256 DEBUG nova.compute.manager [req-a3896c25-338c-45a5-8aed-9a794e3174fb req-e1ec2537-595c-4a45-be89-906742930833 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-changed-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:31:45 np0005538960 nova_compute[187252]: 2025-11-28 16:31:45.900 187256 DEBUG nova.compute.manager [req-a3896c25-338c-45a5-8aed-9a794e3174fb req-e1ec2537-595c-4a45-be89-906742930833 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Refreshing instance network info cache due to event network-changed-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:31:45 np0005538960 nova_compute[187252]: 2025-11-28 16:31:45.900 187256 DEBUG oslo_concurrency.lockutils [req-a3896c25-338c-45a5-8aed-9a794e3174fb req-e1ec2537-595c-4a45-be89-906742930833 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:31:45 np0005538960 nova_compute[187252]: 2025-11-28 16:31:45.901 187256 DEBUG oslo_concurrency.lockutils [req-a3896c25-338c-45a5-8aed-9a794e3174fb req-e1ec2537-595c-4a45-be89-906742930833 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:31:45 np0005538960 nova_compute[187252]: 2025-11-28 16:31:45.901 187256 DEBUG nova.network.neutron [req-a3896c25-338c-45a5-8aed-9a794e3174fb req-e1ec2537-595c-4a45-be89-906742930833 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Refreshing network info cache for port ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:31:47 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:47Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:98:b8:6a 10.100.0.14
Nov 28 11:31:47 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:47Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:98:b8:6a 10.100.0.14
Nov 28 11:31:47 np0005538960 nova_compute[187252]: 2025-11-28 16:31:47.719 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:48 np0005538960 podman[222336]: 2025-11-28 16:31:48.151336732 +0000 UTC m=+0.052682711 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, tcib_managed=true)
Nov 28 11:31:48 np0005538960 podman[222335]: 2025-11-28 16:31:48.160371889 +0000 UTC m=+0.067134269 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:31:48 np0005538960 nova_compute[187252]: 2025-11-28 16:31:48.377 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:49 np0005538960 nova_compute[187252]: 2025-11-28 16:31:49.503 187256 DEBUG nova.network.neutron [req-a3896c25-338c-45a5-8aed-9a794e3174fb req-e1ec2537-595c-4a45-be89-906742930833 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Updated VIF entry in instance network info cache for port ca1c4039-2c03-41ff-ab95-67b86f6e0ee9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:31:49 np0005538960 nova_compute[187252]: 2025-11-28 16:31:49.504 187256 DEBUG nova.network.neutron [req-a3896c25-338c-45a5-8aed-9a794e3174fb req-e1ec2537-595c-4a45-be89-906742930833 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Updating instance_info_cache with network_info: [{"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:31:49 np0005538960 nova_compute[187252]: 2025-11-28 16:31:49.822 187256 DEBUG oslo_concurrency.lockutils [req-a3896c25-338c-45a5-8aed-9a794e3174fb req-e1ec2537-595c-4a45-be89-906742930833 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:31:52 np0005538960 nova_compute[187252]: 2025-11-28 16:31:52.495 187256 INFO nova.compute.manager [None req-065de72b-a45f-4124-b7d5-28d119dad39f 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Get console output#033[00m
Nov 28 11:31:52 np0005538960 nova_compute[187252]: 2025-11-28 16:31:52.500 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:31:52 np0005538960 nova_compute[187252]: 2025-11-28 16:31:52.722 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:52 np0005538960 nova_compute[187252]: 2025-11-28 16:31:52.995 187256 DEBUG nova.objects.instance [None req-13ae8a7a-9dbd-4b6f-a372-3f1ac4808fb2 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'pci_devices' on Instance uuid 88ea9985-1aae-41bc-b36b-f2cfcc70a818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:31:53 np0005538960 nova_compute[187252]: 2025-11-28 16:31:53.016 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347513.0157802, 88ea9985-1aae-41bc-b36b-f2cfcc70a818 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:31:53 np0005538960 nova_compute[187252]: 2025-11-28 16:31:53.016 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:31:53 np0005538960 nova_compute[187252]: 2025-11-28 16:31:53.032 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:31:53 np0005538960 nova_compute[187252]: 2025-11-28 16:31:53.037 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:31:53 np0005538960 nova_compute[187252]: 2025-11-28 16:31:53.057 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Nov 28 11:31:53 np0005538960 podman[222374]: 2025-11-28 16:31:53.154992869 +0000 UTC m=+0.055252822 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:31:53 np0005538960 nova_compute[187252]: 2025-11-28 16:31:53.405 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:54 np0005538960 kernel: tapca1c4039-2c (unregistering): left promiscuous mode
Nov 28 11:31:54 np0005538960 NetworkManager[55548]: <info>  [1764347514.2136] device (tapca1c4039-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:31:54 np0005538960 nova_compute[187252]: 2025-11-28 16:31:54.225 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:54Z|00205|binding|INFO|Releasing lport ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 from this chassis (sb_readonly=0)
Nov 28 11:31:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:54Z|00206|binding|INFO|Setting lport ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 down in Southbound
Nov 28 11:31:54 np0005538960 ovn_controller[95460]: 2025-11-28T16:31:54Z|00207|binding|INFO|Removing iface tapca1c4039-2c ovn-installed in OVS
Nov 28 11:31:54 np0005538960 nova_compute[187252]: 2025-11-28 16:31:54.227 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:54 np0005538960 nova_compute[187252]: 2025-11-28 16:31:54.243 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:54 np0005538960 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Nov 28 11:31:54 np0005538960 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000002b.scope: Consumed 13.788s CPU time.
Nov 28 11:31:54 np0005538960 systemd-machined[153518]: Machine qemu-16-instance-0000002b terminated.
Nov 28 11:31:54 np0005538960 nova_compute[187252]: 2025-11-28 16:31:54.468 187256 DEBUG nova.compute.manager [None req-13ae8a7a-9dbd-4b6f-a372-3f1ac4808fb2 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.479 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:b8:6a 10.100.0.14'], port_security=['fa:16:3e:98:b8:6a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '88ea9985-1aae-41bc-b36b-f2cfcc70a818', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3408f247-1ba1-4e41-821e-cda0531bb57d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a01dcc9-6649-4511-8cfb-c117ff260318, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.481 104369 INFO neutron.agent.ovn.metadata.agent [-] Port ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 in datapath eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 unbound from our chassis#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.482 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.484 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[19c6fa1f-ca58-4705-9531-843dad590098]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.486 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 namespace which is not needed anymore#033[00m
Nov 28 11:31:54 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222245]: [NOTICE]   (222262) : haproxy version is 2.8.14-c23fe91
Nov 28 11:31:54 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222245]: [NOTICE]   (222262) : path to executable is /usr/sbin/haproxy
Nov 28 11:31:54 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222245]: [WARNING]  (222262) : Exiting Master process...
Nov 28 11:31:54 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222245]: [ALERT]    (222262) : Current worker (222265) exited with code 143 (Terminated)
Nov 28 11:31:54 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222245]: [WARNING]  (222262) : All workers exited. Exiting... (0)
Nov 28 11:31:54 np0005538960 systemd[1]: libpod-ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb.scope: Deactivated successfully.
Nov 28 11:31:54 np0005538960 conmon[222245]: conmon ac1215eb1fb7774c81af <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb.scope/container/memory.events
Nov 28 11:31:54 np0005538960 podman[222438]: 2025-11-28 16:31:54.628390812 +0000 UTC m=+0.046240216 container died ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:31:54 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb-userdata-shm.mount: Deactivated successfully.
Nov 28 11:31:54 np0005538960 systemd[1]: var-lib-containers-storage-overlay-059510e768e944ce6e50d23b28ba86f6ebc53d2ccb1cfb7d02a924ba24dde531-merged.mount: Deactivated successfully.
Nov 28 11:31:54 np0005538960 podman[222438]: 2025-11-28 16:31:54.669664656 +0000 UTC m=+0.087514050 container cleanup ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:31:54 np0005538960 systemd[1]: libpod-conmon-ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb.scope: Deactivated successfully.
Nov 28 11:31:54 np0005538960 podman[222467]: 2025-11-28 16:31:54.736060647 +0000 UTC m=+0.044810382 container remove ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.743 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[508dc248-7812-42fa-9263-a3896cdb8068]: (4, ('Fri Nov 28 04:31:54 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 (ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb)\nac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb\nFri Nov 28 04:31:54 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 (ac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb)\nac1215eb1fb7774c81af5c85a050bcb8368196c060d57af70b2b7ed3106e69bb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.746 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c55f698d-e063-4d6f-a3c9-6cc447e7a5f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.747 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb8f40f6-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:31:54 np0005538960 nova_compute[187252]: 2025-11-28 16:31:54.751 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:54 np0005538960 kernel: tapeb8f40f6-30: left promiscuous mode
Nov 28 11:31:54 np0005538960 nova_compute[187252]: 2025-11-28 16:31:54.770 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.776 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[30259624-7c48-43bb-9abf-9e771c94db5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.792 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2212cd61-7e70-48d2-b445-2f67e1733d74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.794 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ef395e41-fe60-4e8a-be8f-43ebceed5e82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.810 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce6c987-9112-475b-9617-836b6dff030a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 459336, 'reachable_time': 40060, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222486, 'error': None, 'target': 'ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.812 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:31:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:31:54.812 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[41efebe4-073c-4e12-88f6-2be780906780]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:31:54 np0005538960 systemd[1]: run-netns-ovnmeta\x2deb8f40f6\x2d3754\x2d4f6d\x2da4be\x2d6f9f22f6f691.mount: Deactivated successfully.
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.161 187256 DEBUG nova.compute.manager [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-vif-unplugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.162 187256 DEBUG oslo_concurrency.lockutils [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.162 187256 DEBUG oslo_concurrency.lockutils [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.162 187256 DEBUG oslo_concurrency.lockutils [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.162 187256 DEBUG nova.compute.manager [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] No waiting events found dispatching network-vif-unplugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.163 187256 WARNING nova.compute.manager [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received unexpected event network-vif-unplugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 for instance with vm_state suspended and task_state None.#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.163 187256 DEBUG nova.compute.manager [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.163 187256 DEBUG oslo_concurrency.lockutils [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.163 187256 DEBUG oslo_concurrency.lockutils [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.164 187256 DEBUG oslo_concurrency.lockutils [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.164 187256 DEBUG nova.compute.manager [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] No waiting events found dispatching network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:31:56 np0005538960 nova_compute[187252]: 2025-11-28 16:31:56.164 187256 WARNING nova.compute.manager [req-545b59d1-de07-43fb-ac29-8dade732c1d4 req-99f3d6ed-d523-4ee6-ae63-bc3c4cea0348 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received unexpected event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 for instance with vm_state suspended and task_state None.#033[00m
Nov 28 11:31:57 np0005538960 podman[222487]: 2025-11-28 16:31:57.181453645 +0000 UTC m=+0.088705178 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, distribution-scope=public, config_id=edpm, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 11:31:57 np0005538960 nova_compute[187252]: 2025-11-28 16:31:57.724 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:31:58 np0005538960 nova_compute[187252]: 2025-11-28 16:31:58.451 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:00 np0005538960 nova_compute[187252]: 2025-11-28 16:32:00.849 187256 INFO nova.compute.manager [None req-8753a22e-d9fe-41d0-9a3c-afe1f41fa492 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Get console output#033[00m
Nov 28 11:32:01 np0005538960 nova_compute[187252]: 2025-11-28 16:32:01.168 187256 INFO nova.compute.manager [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Resuming#033[00m
Nov 28 11:32:01 np0005538960 nova_compute[187252]: 2025-11-28 16:32:01.169 187256 DEBUG nova.objects.instance [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'flavor' on Instance uuid 88ea9985-1aae-41bc-b36b-f2cfcc70a818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:32:01 np0005538960 nova_compute[187252]: 2025-11-28 16:32:01.219 187256 DEBUG oslo_concurrency.lockutils [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:32:01 np0005538960 nova_compute[187252]: 2025-11-28 16:32:01.219 187256 DEBUG oslo_concurrency.lockutils [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquired lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:32:01 np0005538960 nova_compute[187252]: 2025-11-28 16:32:01.219 187256 DEBUG nova.network.neutron [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:32:02 np0005538960 nova_compute[187252]: 2025-11-28 16:32:02.727 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:03 np0005538960 nova_compute[187252]: 2025-11-28 16:32:03.453 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.698 187256 DEBUG nova.network.neutron [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Updating instance_info_cache with network_info: [{"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.869 187256 DEBUG oslo_concurrency.lockutils [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Releasing lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.875 187256 DEBUG nova.virt.libvirt.vif [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:31:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1949144222',display_name='tempest-TestNetworkAdvancedServerOps-server-1949144222',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1949144222',id=43,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNX0V1ZWqLkvjqknjHue7eMDjmU57Kyi77YE+oInmP6qNV/QL9X1483/QWccYbgRcBJxP6wOD3EZHA4fgPCGYTsYgxlD9MQiFEHV3r0YinxX/QpdEknYPmrXIJ/dQd9U0Q==',key_name='tempest-TestNetworkAdvancedServerOps-1862960466',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:31:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-kv5ts5xj',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:31:54Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=88ea9985-1aae-41bc-b36b-f2cfcc70a818,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.876 187256 DEBUG nova.network.os_vif_util [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.876 187256 DEBUG nova.network.os_vif_util [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:b8:6a,bridge_name='br-int',has_traffic_filtering=True,id=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9,network=Network(eb8f40f6-3754-4f6d-a4be-6f9f22f6f691),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca1c4039-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.877 187256 DEBUG os_vif [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:b8:6a,bridge_name='br-int',has_traffic_filtering=True,id=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9,network=Network(eb8f40f6-3754-4f6d-a4be-6f9f22f6f691),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca1c4039-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.877 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.878 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.878 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.881 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.882 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapca1c4039-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.882 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapca1c4039-2c, col_values=(('external_ids', {'iface-id': 'ca1c4039-2c03-41ff-ab95-67b86f6e0ee9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:98:b8:6a', 'vm-uuid': '88ea9985-1aae-41bc-b36b-f2cfcc70a818'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.883 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:32:04 np0005538960 nova_compute[187252]: 2025-11-28 16:32:04.883 187256 INFO os_vif [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:b8:6a,bridge_name='br-int',has_traffic_filtering=True,id=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9,network=Network(eb8f40f6-3754-4f6d-a4be-6f9f22f6f691),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca1c4039-2c')#033[00m
Nov 28 11:32:04 np0005538960 podman[222509]: 2025-11-28 16:32:04.972770894 +0000 UTC m=+0.057449636 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes 
Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125)
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.297 187256 DEBUG nova.objects.instance [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'numa_topology' on Instance uuid 88ea9985-1aae-41bc-b36b-f2cfcc70a818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:32:05 np0005538960 kernel: tapca1c4039-2c: entered promiscuous mode
Nov 28 11:32:05 np0005538960 ovn_controller[95460]: 2025-11-28T16:32:05Z|00208|binding|INFO|Claiming lport ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 for this chassis.
Nov 28 11:32:05 np0005538960 ovn_controller[95460]: 2025-11-28T16:32:05Z|00209|binding|INFO|ca1c4039-2c03-41ff-ab95-67b86f6e0ee9: Claiming fa:16:3e:98:b8:6a 10.100.0.14
Nov 28 11:32:05 np0005538960 NetworkManager[55548]: <info>  [1764347525.3734] manager: (tapca1c4039-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.373 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:05 np0005538960 ovn_controller[95460]: 2025-11-28T16:32:05Z|00210|binding|INFO|Setting lport ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 ovn-installed in OVS
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.390 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:b8:6a 10.100.0.14'], port_security=['fa:16:3e:98:b8:6a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '88ea9985-1aae-41bc-b36b-f2cfcc70a818', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '5', 'neutron:security_group_ids': '3408f247-1ba1-4e41-821e-cda0531bb57d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.214'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a01dcc9-6649-4511-8cfb-c117ff260318, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:32:05 np0005538960 ovn_controller[95460]: 2025-11-28T16:32:05Z|00211|binding|INFO|Setting lport ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 up in Southbound
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.392 104369 INFO neutron.agent.ovn.metadata.agent [-] Port ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 in datapath eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 bound to our chassis#033[00m
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.392 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.393 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eb8f40f6-3754-4f6d-a4be-6f9f22f6f691#033[00m
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.396 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:05 np0005538960 systemd-udevd[222544]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.407 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2140a78f-298f-416c-9cd0-b0b4adef0a94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.408 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeb8f40f6-31 in ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.409 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeb8f40f6-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.410 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[6a21a794-17c2-40a9-810b-36eb02e57ab6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.411 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f96bb6a0-1c4b-4ee7-8129-63130b0f13cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 NetworkManager[55548]: <info>  [1764347525.4166] device (tapca1c4039-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:32:05 np0005538960 NetworkManager[55548]: <info>  [1764347525.4178] device (tapca1c4039-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.422 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[046993ec-99fc-4d43-b381-ea969bc52be1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 systemd-machined[153518]: New machine qemu-17-instance-0000002b.
Nov 28 11:32:05 np0005538960 systemd[1]: Started Virtual Machine qemu-17-instance-0000002b.
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.446 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[aa556c13-21cf-4266-b5fa-f0124ff18b5d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.476 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[dd9aad92-533a-4976-881a-7c70cf0a041e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 NetworkManager[55548]: <info>  [1764347525.4831] manager: (tapeb8f40f6-30): new Veth device (/org/freedesktop/NetworkManager/Devices/112)
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.482 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[383516b3-c10b-4fb6-8dbd-10ebde6fc9a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 systemd-udevd[222550]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.516 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[0adea33c-2b49-4035-a0f4-6f2aa545b93e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.519 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[e62f1947-6148-4114-8bf2-e89398077123]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 NetworkManager[55548]: <info>  [1764347525.5440] device (tapeb8f40f6-30): carrier: link connected
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.549 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[acc7dc2d-2382-4161-a139-33e5bb240dff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.567 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[18a77a3c-a60c-4fce-9b2d-ac118bd4bce5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb8f40f6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:08:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462512, 'reachable_time': 30083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222580, 'error': None, 'target': 'ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.584 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[791b4a04-47fc-411c-b58c-fbe38132f1fd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:8d0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 462512, 'tstamp': 462512}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222581, 'error': None, 'target': 'ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.603 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[93a52184-d149-4b9c-a17a-4cab90078cb5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb8f40f6-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:08:d0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462512, 'reachable_time': 30083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222582, 'error': None, 'target': 'ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.641 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2761c912-1a19-4f67-85c6-4a1977ea74cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.707 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b439b66b-baa1-4bef-b20c-8e818284364c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.710 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb8f40f6-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.710 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.711 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb8f40f6-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.714 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:05 np0005538960 NetworkManager[55548]: <info>  [1764347525.7152] manager: (tapeb8f40f6-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/113)
Nov 28 11:32:05 np0005538960 kernel: tapeb8f40f6-30: entered promiscuous mode
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.717 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.721 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeb8f40f6-30, col_values=(('external_ids', {'iface-id': 'e688a542-eaf2-403a-92da-c73d2f7f4a79'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.723 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:05 np0005538960 ovn_controller[95460]: 2025-11-28T16:32:05Z|00212|binding|INFO|Releasing lport e688a542-eaf2-403a-92da-c73d2f7f4a79 from this chassis (sb_readonly=0)
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.723 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.725 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb8f40f6-3754-4f6d-a4be-6f9f22f6f691.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb8f40f6-3754-4f6d-a4be-6f9f22f6f691.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.726 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c7b2629d-bb45-442c-bcf3-7edbef7d2c04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.727 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/eb8f40f6-3754-4f6d-a4be-6f9f22f6f691.pid.haproxy
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID eb8f40f6-3754-4f6d-a4be-6f9f22f6f691
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:32:05 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:05.728 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'env', 'PROCESS_TAG=haproxy-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eb8f40f6-3754-4f6d-a4be-6f9f22f6f691.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.734 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.970 187256 DEBUG nova.virt.libvirt.host [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Removed pending event for 88ea9985-1aae-41bc-b36b-f2cfcc70a818 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.970 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347525.9698172, 88ea9985-1aae-41bc-b36b-f2cfcc70a818 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:32:05 np0005538960 nova_compute[187252]: 2025-11-28 16:32:05.971 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] VM Started (Lifecycle Event)#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.006 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:32:06 np0005538960 podman[222621]: 2025-11-28 16:32:06.095690388 +0000 UTC m=+0.028934008 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:32:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:06.351 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:32:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:06.352 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:32:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:06.352 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.362 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347526.361523, 88ea9985-1aae-41bc-b36b-f2cfcc70a818 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.363 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.381 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.386 187256 DEBUG nova.compute.manager [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.387 187256 DEBUG nova.objects.instance [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'pci_devices' on Instance uuid 88ea9985-1aae-41bc-b36b-f2cfcc70a818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.391 187256 DEBUG nova.compute.manager [req-e63df8bc-6c26-4c9b-9350-c793bbdd30d9 req-f2202c1e-f18a-4e59-b18f-26e6b862691f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.392 187256 DEBUG oslo_concurrency.lockutils [req-e63df8bc-6c26-4c9b-9350-c793bbdd30d9 req-f2202c1e-f18a-4e59-b18f-26e6b862691f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.392 187256 DEBUG oslo_concurrency.lockutils [req-e63df8bc-6c26-4c9b-9350-c793bbdd30d9 req-f2202c1e-f18a-4e59-b18f-26e6b862691f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.393 187256 DEBUG oslo_concurrency.lockutils [req-e63df8bc-6c26-4c9b-9350-c793bbdd30d9 req-f2202c1e-f18a-4e59-b18f-26e6b862691f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.393 187256 DEBUG nova.compute.manager [req-e63df8bc-6c26-4c9b-9350-c793bbdd30d9 req-f2202c1e-f18a-4e59-b18f-26e6b862691f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] No waiting events found dispatching network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.393 187256 WARNING nova.compute.manager [req-e63df8bc-6c26-4c9b-9350-c793bbdd30d9 req-f2202c1e-f18a-4e59-b18f-26e6b862691f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received unexpected event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 for instance with vm_state suspended and task_state resuming.#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.394 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.413 187256 INFO nova.virt.libvirt.driver [-] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Instance running successfully.#033[00m
Nov 28 11:32:06 np0005538960 virtqemud[186797]: argument unsupported: QEMU guest agent is not configured
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.417 187256 DEBUG nova.virt.libvirt.guest [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.417 187256 DEBUG nova.compute.manager [None req-0c169fa3-e434-45e3-9ad9-7cffac5e1666 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:32:06 np0005538960 podman[222621]: 2025-11-28 16:32:06.420559868 +0000 UTC m=+0.353803478 container create c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 11:32:06 np0005538960 nova_compute[187252]: 2025-11-28 16:32:06.446 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Nov 28 11:32:06 np0005538960 systemd[1]: Started libpod-conmon-c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1.scope.
Nov 28 11:32:06 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:32:06 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5faf01cf67163b63655fa02be0c811e28d472737b110c5eaf7f19907ceba0a6a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:32:06 np0005538960 podman[222621]: 2025-11-28 16:32:06.516465959 +0000 UTC m=+0.449709559 container init c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 11:32:06 np0005538960 podman[222621]: 2025-11-28 16:32:06.523688393 +0000 UTC m=+0.456931983 container start c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, io.buildah.version=1.41.3)
Nov 28 11:32:06 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222636]: [NOTICE]   (222640) : New worker (222642) forked
Nov 28 11:32:06 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222636]: [NOTICE]   (222640) : Loading success.
Nov 28 11:32:07 np0005538960 nova_compute[187252]: 2025-11-28 16:32:07.732 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:08 np0005538960 nova_compute[187252]: 2025-11-28 16:32:08.094 187256 INFO nova.compute.manager [None req-12aea9a8-4ad9-46c4-8de4-8c9b1adedb75 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Get console output#033[00m
Nov 28 11:32:08 np0005538960 nova_compute[187252]: 2025-11-28 16:32:08.099 214150 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Nov 28 11:32:08 np0005538960 nova_compute[187252]: 2025-11-28 16:32:08.456 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:08 np0005538960 nova_compute[187252]: 2025-11-28 16:32:08.556 187256 DEBUG nova.compute.manager [req-a2a44853-9555-4b22-b450-501e962ffe50 req-9bffb586-9dcb-4549-8076-f64128438e94 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:32:08 np0005538960 nova_compute[187252]: 2025-11-28 16:32:08.557 187256 DEBUG oslo_concurrency.lockutils [req-a2a44853-9555-4b22-b450-501e962ffe50 req-9bffb586-9dcb-4549-8076-f64128438e94 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:32:08 np0005538960 nova_compute[187252]: 2025-11-28 16:32:08.558 187256 DEBUG oslo_concurrency.lockutils [req-a2a44853-9555-4b22-b450-501e962ffe50 req-9bffb586-9dcb-4549-8076-f64128438e94 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:32:08 np0005538960 nova_compute[187252]: 2025-11-28 16:32:08.558 187256 DEBUG oslo_concurrency.lockutils [req-a2a44853-9555-4b22-b450-501e962ffe50 req-9bffb586-9dcb-4549-8076-f64128438e94 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:32:08 np0005538960 nova_compute[187252]: 2025-11-28 16:32:08.558 187256 DEBUG nova.compute.manager [req-a2a44853-9555-4b22-b450-501e962ffe50 req-9bffb586-9dcb-4549-8076-f64128438e94 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] No waiting events found dispatching network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:32:08 np0005538960 nova_compute[187252]: 2025-11-28 16:32:08.559 187256 WARNING nova.compute.manager [req-a2a44853-9555-4b22-b450-501e962ffe50 req-9bffb586-9dcb-4549-8076-f64128438e94 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received unexpected event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.078 187256 DEBUG nova.compute.manager [req-794ca80c-5799-4bfd-b169-6362d6d87e2a req-bac79824-8727-404f-84d8-bbe93977068d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-changed-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.078 187256 DEBUG nova.compute.manager [req-794ca80c-5799-4bfd-b169-6362d6d87e2a req-bac79824-8727-404f-84d8-bbe93977068d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Refreshing instance network info cache due to event network-changed-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.079 187256 DEBUG oslo_concurrency.lockutils [req-794ca80c-5799-4bfd-b169-6362d6d87e2a req-bac79824-8727-404f-84d8-bbe93977068d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.079 187256 DEBUG oslo_concurrency.lockutils [req-794ca80c-5799-4bfd-b169-6362d6d87e2a req-bac79824-8727-404f-84d8-bbe93977068d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.079 187256 DEBUG nova.network.neutron [req-794ca80c-5799-4bfd-b169-6362d6d87e2a req-bac79824-8727-404f-84d8-bbe93977068d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Refreshing network info cache for port ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:32:09 np0005538960 podman[222651]: 2025-11-28 16:32:09.210102742 +0000 UTC m=+0.105404911 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.301 187256 DEBUG oslo_concurrency.lockutils [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.301 187256 DEBUG oslo_concurrency.lockutils [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.302 187256 DEBUG oslo_concurrency.lockutils [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.302 187256 DEBUG oslo_concurrency.lockutils [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.302 187256 DEBUG oslo_concurrency.lockutils [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.303 187256 INFO nova.compute.manager [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Terminating instance#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.304 187256 DEBUG nova.compute.manager [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:32:09 np0005538960 kernel: tapca1c4039-2c (unregistering): left promiscuous mode
Nov 28 11:32:09 np0005538960 NetworkManager[55548]: <info>  [1764347529.3262] device (tapca1c4039-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.332 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:09 np0005538960 ovn_controller[95460]: 2025-11-28T16:32:09Z|00213|binding|INFO|Releasing lport ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 from this chassis (sb_readonly=0)
Nov 28 11:32:09 np0005538960 ovn_controller[95460]: 2025-11-28T16:32:09Z|00214|binding|INFO|Setting lport ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 down in Southbound
Nov 28 11:32:09 np0005538960 ovn_controller[95460]: 2025-11-28T16:32:09Z|00215|binding|INFO|Removing iface tapca1c4039-2c ovn-installed in OVS
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.335 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.349 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.363 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:98:b8:6a 10.100.0.14'], port_security=['fa:16:3e:98:b8:6a 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '88ea9985-1aae-41bc-b36b-f2cfcc70a818', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e408bace48b41a1ac0677d300b6d288', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3408f247-1ba1-4e41-821e-cda0531bb57d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7a01dcc9-6649-4511-8cfb-c117ff260318, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.364 104369 INFO neutron.agent.ovn.metadata.agent [-] Port ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 in datapath eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 unbound from our chassis#033[00m
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.366 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.367 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5e944028-afe3-491e-a028-8cda4438f1de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.368 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 namespace which is not needed anymore#033[00m
Nov 28 11:32:09 np0005538960 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Nov 28 11:32:09 np0005538960 systemd-machined[153518]: Machine qemu-17-instance-0000002b terminated.
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.550 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.596 187256 INFO nova.virt.libvirt.driver [-] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Instance destroyed successfully.#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.597 187256 DEBUG nova.objects.instance [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lazy-loading 'resources' on Instance uuid 88ea9985-1aae-41bc-b36b-f2cfcc70a818 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.626 187256 DEBUG nova.virt.libvirt.vif [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:31:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1949144222',display_name='tempest-TestNetworkAdvancedServerOps-server-1949144222',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1949144222',id=43,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNX0V1ZWqLkvjqknjHue7eMDjmU57Kyi77YE+oInmP6qNV/QL9X1483/QWccYbgRcBJxP6wOD3EZHA4fgPCGYTsYgxlD9MQiFEHV3r0YinxX/QpdEknYPmrXIJ/dQd9U0Q==',key_name='tempest-TestNetworkAdvancedServerOps-1862960466',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:31:34Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7e408bace48b41a1ac0677d300b6d288',ramdisk_id='',reservation_id='r-kv5ts5xj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-762685809',owner_user_name='tempest-TestNetworkAdvancedServerOps-762685809-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:32:06Z,user_data=None,user_id='5d381eba17324dd5ad798648b82d0115',uuid=88ea9985-1aae-41bc-b36b-f2cfcc70a818,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.626 187256 DEBUG nova.network.os_vif_util [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converting VIF {"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.214", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.627 187256 DEBUG nova.network.os_vif_util [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:98:b8:6a,bridge_name='br-int',has_traffic_filtering=True,id=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9,network=Network(eb8f40f6-3754-4f6d-a4be-6f9f22f6f691),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca1c4039-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.627 187256 DEBUG os_vif [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:b8:6a,bridge_name='br-int',has_traffic_filtering=True,id=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9,network=Network(eb8f40f6-3754-4f6d-a4be-6f9f22f6f691),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca1c4039-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.630 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.630 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapca1c4039-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.633 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.635 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.638 187256 INFO os_vif [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:98:b8:6a,bridge_name='br-int',has_traffic_filtering=True,id=ca1c4039-2c03-41ff-ab95-67b86f6e0ee9,network=Network(eb8f40f6-3754-4f6d-a4be-6f9f22f6f691),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapca1c4039-2c')#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.638 187256 INFO nova.virt.libvirt.driver [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Deleting instance files /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818_del#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.639 187256 INFO nova.virt.libvirt.driver [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Deletion of /var/lib/nova/instances/88ea9985-1aae-41bc-b36b-f2cfcc70a818_del complete#033[00m
Nov 28 11:32:09 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222636]: [NOTICE]   (222640) : haproxy version is 2.8.14-c23fe91
Nov 28 11:32:09 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222636]: [NOTICE]   (222640) : path to executable is /usr/sbin/haproxy
Nov 28 11:32:09 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222636]: [WARNING]  (222640) : Exiting Master process...
Nov 28 11:32:09 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222636]: [WARNING]  (222640) : Exiting Master process...
Nov 28 11:32:09 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222636]: [ALERT]    (222640) : Current worker (222642) exited with code 143 (Terminated)
Nov 28 11:32:09 np0005538960 neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691[222636]: [WARNING]  (222640) : All workers exited. Exiting... (0)
Nov 28 11:32:09 np0005538960 systemd[1]: libpod-c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1.scope: Deactivated successfully.
Nov 28 11:32:09 np0005538960 podman[222697]: 2025-11-28 16:32:09.682344174 +0000 UTC m=+0.207755928 container died c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 11:32:09 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1-userdata-shm.mount: Deactivated successfully.
Nov 28 11:32:09 np0005538960 systemd[1]: var-lib-containers-storage-overlay-5faf01cf67163b63655fa02be0c811e28d472737b110c5eaf7f19907ceba0a6a-merged.mount: Deactivated successfully.
Nov 28 11:32:09 np0005538960 podman[222697]: 2025-11-28 16:32:09.723803134 +0000 UTC m=+0.249214888 container cleanup c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 11:32:09 np0005538960 systemd[1]: libpod-conmon-c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1.scope: Deactivated successfully.
Nov 28 11:32:09 np0005538960 podman[222745]: 2025-11-28 16:32:09.959680048 +0000 UTC m=+0.211310293 container remove c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.965 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9c45550e-2b09-40f3-ac30-91bd82dcda1f]: (4, ('Fri Nov 28 04:32:09 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 (c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1)\nc669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1\nFri Nov 28 04:32:09 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 (c669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1)\nc669652dee19f66160a292723484db673173d8f03f12898ad93efdbaa8e6bda1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.967 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[04f7a2a9-58cf-4dc3-b185-2c82946d083a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.968 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb8f40f6-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.970 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:09 np0005538960 kernel: tapeb8f40f6-30: left promiscuous mode
Nov 28 11:32:09 np0005538960 nova_compute[187252]: 2025-11-28 16:32:09.982 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.984 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[c12213e9-8df1-4976-a4a9-441bdcd04751]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.997 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[4a8b312b-61d2-4a03-8b65-16fa358dff91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:09 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:09.998 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[fc21e992-c8dd-44d0-8034-4dd72b4e244c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:10 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:10.012 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3b7b5e66-c96d-4056-ba5d-9881e3864a03]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 462505, 'reachable_time': 22259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222761, 'error': None, 'target': 'ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:10 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:10.015 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eb8f40f6-3754-4f6d-a4be-6f9f22f6f691 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:32:10 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:10.015 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa4be64-73ef-46a1-9266-3e08f7e6d8d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:32:10 np0005538960 systemd[1]: run-netns-ovnmeta\x2deb8f40f6\x2d3754\x2d4f6d\x2da4be\x2d6f9f22f6f691.mount: Deactivated successfully.
Nov 28 11:32:10 np0005538960 nova_compute[187252]: 2025-11-28 16:32:10.071 187256 INFO nova.compute.manager [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Took 0.77 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:32:10 np0005538960 nova_compute[187252]: 2025-11-28 16:32:10.072 187256 DEBUG oslo.service.loopingcall [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:32:10 np0005538960 nova_compute[187252]: 2025-11-28 16:32:10.073 187256 DEBUG nova.compute.manager [-] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:32:10 np0005538960 nova_compute[187252]: 2025-11-28 16:32:10.073 187256 DEBUG nova.network.neutron [-] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:32:10 np0005538960 nova_compute[187252]: 2025-11-28 16:32:10.642 187256 DEBUG nova.compute.manager [req-eef4ca8b-43a3-4c54-8112-5d831404b349 req-b5ad1740-225a-424b-a27d-73cbb5edb262 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-vif-unplugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:32:10 np0005538960 nova_compute[187252]: 2025-11-28 16:32:10.643 187256 DEBUG oslo_concurrency.lockutils [req-eef4ca8b-43a3-4c54-8112-5d831404b349 req-b5ad1740-225a-424b-a27d-73cbb5edb262 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:32:10 np0005538960 nova_compute[187252]: 2025-11-28 16:32:10.643 187256 DEBUG oslo_concurrency.lockutils [req-eef4ca8b-43a3-4c54-8112-5d831404b349 req-b5ad1740-225a-424b-a27d-73cbb5edb262 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:32:10 np0005538960 nova_compute[187252]: 2025-11-28 16:32:10.643 187256 DEBUG oslo_concurrency.lockutils [req-eef4ca8b-43a3-4c54-8112-5d831404b349 req-b5ad1740-225a-424b-a27d-73cbb5edb262 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:32:10 np0005538960 nova_compute[187252]: 2025-11-28 16:32:10.643 187256 DEBUG nova.compute.manager [req-eef4ca8b-43a3-4c54-8112-5d831404b349 req-b5ad1740-225a-424b-a27d-73cbb5edb262 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] No waiting events found dispatching network-vif-unplugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:32:10 np0005538960 nova_compute[187252]: 2025-11-28 16:32:10.644 187256 DEBUG nova.compute.manager [req-eef4ca8b-43a3-4c54-8112-5d831404b349 req-b5ad1740-225a-424b-a27d-73cbb5edb262 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-vif-unplugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:32:13 np0005538960 nova_compute[187252]: 2025-11-28 16:32:13.458 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:13 np0005538960 nova_compute[187252]: 2025-11-28 16:32:13.868 187256 DEBUG nova.compute.manager [req-1fb55b04-58a9-48f4-ab4b-32826efca51d req-55b49faa-6c1d-42a0-92b0-355ca330d372 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:32:13 np0005538960 nova_compute[187252]: 2025-11-28 16:32:13.868 187256 DEBUG oslo_concurrency.lockutils [req-1fb55b04-58a9-48f4-ab4b-32826efca51d req-55b49faa-6c1d-42a0-92b0-355ca330d372 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:32:13 np0005538960 nova_compute[187252]: 2025-11-28 16:32:13.868 187256 DEBUG oslo_concurrency.lockutils [req-1fb55b04-58a9-48f4-ab4b-32826efca51d req-55b49faa-6c1d-42a0-92b0-355ca330d372 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:32:13 np0005538960 nova_compute[187252]: 2025-11-28 16:32:13.869 187256 DEBUG oslo_concurrency.lockutils [req-1fb55b04-58a9-48f4-ab4b-32826efca51d req-55b49faa-6c1d-42a0-92b0-355ca330d372 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:32:13 np0005538960 nova_compute[187252]: 2025-11-28 16:32:13.869 187256 DEBUG nova.compute.manager [req-1fb55b04-58a9-48f4-ab4b-32826efca51d req-55b49faa-6c1d-42a0-92b0-355ca330d372 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] No waiting events found dispatching network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:32:13 np0005538960 nova_compute[187252]: 2025-11-28 16:32:13.869 187256 WARNING nova.compute.manager [req-1fb55b04-58a9-48f4-ab4b-32826efca51d req-55b49faa-6c1d-42a0-92b0-355ca330d372 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received unexpected event network-vif-plugged-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 for instance with vm_state active and task_state deleting.#033[00m
Nov 28 11:32:14 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:14.073 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:32:14 np0005538960 nova_compute[187252]: 2025-11-28 16:32:14.074 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:32:14 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:14.074 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:32:14 np0005538960 nova_compute[187252]: 2025-11-28 16:32:14.230 187256 DEBUG nova.network.neutron [req-794ca80c-5799-4bfd-b169-6362d6d87e2a req-bac79824-8727-404f-84d8-bbe93977068d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Updated VIF entry in instance network info cache for port ca1c4039-2c03-41ff-ab95-67b86f6e0ee9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:32:14 np0005538960 nova_compute[187252]: 2025-11-28 16:32:14.231 187256 DEBUG nova.network.neutron [req-794ca80c-5799-4bfd-b169-6362d6d87e2a req-bac79824-8727-404f-84d8-bbe93977068d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Updating instance_info_cache with network_info: [{"id": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "address": "fa:16:3e:98:b8:6a", "network": {"id": "eb8f40f6-3754-4f6d-a4be-6f9f22f6f691", "bridge": "br-int", "label": "tempest-network-smoke--802007168", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7e408bace48b41a1ac0677d300b6d288", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapca1c4039-2c", "ovs_interfaceid": "ca1c4039-2c03-41ff-ab95-67b86f6e0ee9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:32:14 np0005538960 nova_compute[187252]: 2025-11-28 16:32:14.312 187256 DEBUG oslo_concurrency.lockutils [req-794ca80c-5799-4bfd-b169-6362d6d87e2a req-bac79824-8727-404f-84d8-bbe93977068d 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-88ea9985-1aae-41bc-b36b-f2cfcc70a818" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Nov 28 11:32:14 np0005538960 nova_compute[187252]: 2025-11-28 16:32:14.634 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:15 np0005538960 nova_compute[187252]: 2025-11-28 16:32:15.098 187256 DEBUG nova.network.neutron [-] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Nov 28 11:32:15 np0005538960 nova_compute[187252]: 2025-11-28 16:32:15.241 187256 INFO nova.compute.manager [-] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Took 5.17 seconds to deallocate network for instance.
Nov 28 11:32:15 np0005538960 nova_compute[187252]: 2025-11-28 16:32:15.468 187256 DEBUG oslo_concurrency.lockutils [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:32:15 np0005538960 nova_compute[187252]: 2025-11-28 16:32:15.469 187256 DEBUG oslo_concurrency.lockutils [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:32:16 np0005538960 podman[222762]: 2025-11-28 16:32:16.221240484 +0000 UTC m=+0.127142886 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:32:16 np0005538960 nova_compute[187252]: 2025-11-28 16:32:16.560 187256 DEBUG nova.compute.provider_tree [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 11:32:16 np0005538960 nova_compute[187252]: 2025-11-28 16:32:16.599 187256 DEBUG nova.scheduler.client.report [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 11:32:16 np0005538960 nova_compute[187252]: 2025-11-28 16:32:16.616 187256 DEBUG nova.compute.manager [req-b915d15a-02c3-4c59-9d57-593e29654c2a req-82b90641-09a2-4776-aaa9-b21ff35ffdf2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Received event network-vif-deleted-ca1c4039-2c03-41ff-ab95-67b86f6e0ee9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Nov 28 11:32:16 np0005538960 nova_compute[187252]: 2025-11-28 16:32:16.645 187256 DEBUG oslo_concurrency.lockutils [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:32:17 np0005538960 nova_compute[187252]: 2025-11-28 16:32:17.181 187256 INFO nova.scheduler.client.report [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Deleted allocations for instance 88ea9985-1aae-41bc-b36b-f2cfcc70a818
Nov 28 11:32:17 np0005538960 nova_compute[187252]: 2025-11-28 16:32:17.454 187256 DEBUG oslo_concurrency.lockutils [None req-73e701ad-1a78-45bb-bbc6-98b6712bb22a 5d381eba17324dd5ad798648b82d0115 7e408bace48b41a1ac0677d300b6d288 - - default default] Lock "88ea9985-1aae-41bc-b36b-f2cfcc70a818" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:32:18 np0005538960 nova_compute[187252]: 2025-11-28 16:32:18.462 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:19 np0005538960 podman[222791]: 2025-11-28 16:32:19.1590199 +0000 UTC m=+0.061196636 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:32:19 np0005538960 podman[222790]: 2025-11-28 16:32:19.190087889 +0000 UTC m=+0.095210526 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 11:32:19 np0005538960 nova_compute[187252]: 2025-11-28 16:32:19.638 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:22 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:32:22.077 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Nov 28 11:32:23 np0005538960 nova_compute[187252]: 2025-11-28 16:32:23.372 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:32:23 np0005538960 nova_compute[187252]: 2025-11-28 16:32:23.498 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:24 np0005538960 nova_compute[187252]: 2025-11-28 16:32:24.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:32:24 np0005538960 nova_compute[187252]: 2025-11-28 16:32:24.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:32:24 np0005538960 nova_compute[187252]: 2025-11-28 16:32:24.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Nov 28 11:32:24 np0005538960 nova_compute[187252]: 2025-11-28 16:32:24.595 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347529.59379, 88ea9985-1aae-41bc-b36b-f2cfcc70a818 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Nov 28 11:32:24 np0005538960 nova_compute[187252]: 2025-11-28 16:32:24.595 187256 INFO nova.compute.manager [-] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] VM Stopped (Lifecycle Event)
Nov 28 11:32:24 np0005538960 nova_compute[187252]: 2025-11-28 16:32:24.662 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:24 np0005538960 nova_compute[187252]: 2025-11-28 16:32:24.684 187256 DEBUG nova.compute.manager [None req-fa53ce7c-92fa-480a-b0ad-0688d7213331 - - - - - -] [instance: 88ea9985-1aae-41bc-b36b-f2cfcc70a818] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Nov 28 11:32:24 np0005538960 podman[222831]: 2025-11-28 16:32:24.730661706 +0000 UTC m=+0.051351349 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:32:26 np0005538960 nova_compute[187252]: 2025-11-28 16:32:26.404 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:32:27 np0005538960 nova_compute[187252]: 2025-11-28 16:32:27.016 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:27 np0005538960 nova_compute[187252]: 2025-11-28 16:32:27.182 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:27 np0005538960 nova_compute[187252]: 2025-11-28 16:32:27.337 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:32:27 np0005538960 nova_compute[187252]: 2025-11-28 16:32:27.337 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Nov 28 11:32:27 np0005538960 nova_compute[187252]: 2025-11-28 16:32:27.337 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Nov 28 11:32:27 np0005538960 nova_compute[187252]: 2025-11-28 16:32:27.348 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Nov 28 11:32:28 np0005538960 podman[222856]: 2025-11-28 16:32:28.150242785 +0000 UTC m=+0.056169454 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, distribution-scope=public, name=ubi9-minimal)
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.339 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.490 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.491 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5708MB free_disk=73.33794403076172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.492 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.492 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.500 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.558 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.559 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.578 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.590 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.828 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Nov 28 11:32:28 np0005538960 nova_compute[187252]: 2025-11-28 16:32:28.829 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:32:29 np0005538960 nova_compute[187252]: 2025-11-28 16:32:29.665 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:29 np0005538960 nova_compute[187252]: 2025-11-28 16:32:29.824 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:32:29 np0005538960 nova_compute[187252]: 2025-11-28 16:32:29.825 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:32:32 np0005538960 nova_compute[187252]: 2025-11-28 16:32:32.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:32:33 np0005538960 nova_compute[187252]: 2025-11-28 16:32:33.502 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:34 np0005538960 nova_compute[187252]: 2025-11-28 16:32:34.669 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:35 np0005538960 podman[222879]: 2025-11-28 16:32:35.147072763 +0000 UTC m=+0.052297142 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.312 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.313 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:32:35.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:32:38 np0005538960 nova_compute[187252]: 2025-11-28 16:32:38.541 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:39 np0005538960 nova_compute[187252]: 2025-11-28 16:32:39.673 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:40 np0005538960 podman[222899]: 2025-11-28 16:32:40.15110864 +0000 UTC m=+0.052662870 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:32:43 np0005538960 nova_compute[187252]: 2025-11-28 16:32:43.548 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:44 np0005538960 nova_compute[187252]: 2025-11-28 16:32:44.678 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:47 np0005538960 podman[222925]: 2025-11-28 16:32:47.263440662 +0000 UTC m=+0.146617884 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 11:32:48 np0005538960 nova_compute[187252]: 2025-11-28 16:32:48.551 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:49 np0005538960 nova_compute[187252]: 2025-11-28 16:32:49.681 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:50 np0005538960 podman[222954]: 2025-11-28 16:32:50.155555009 +0000 UTC m=+0.057623250 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:32:50 np0005538960 podman[222953]: 2025-11-28 16:32:50.168583462 +0000 UTC m=+0.054514555 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 11:32:53 np0005538960 nova_compute[187252]: 2025-11-28 16:32:53.554 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:54 np0005538960 nova_compute[187252]: 2025-11-28 16:32:54.684 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:55 np0005538960 podman[222993]: 2025-11-28 16:32:55.165189141 +0000 UTC m=+0.057653160 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:32:58 np0005538960 nova_compute[187252]: 2025-11-28 16:32:58.556 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:32:59 np0005538960 podman[223021]: 2025-11-28 16:32:59.166140401 +0000 UTC m=+0.069691129 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:32:59 np0005538960 nova_compute[187252]: 2025-11-28 16:32:59.687 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:33:03 np0005538960 nova_compute[187252]: 2025-11-28 16:33:03.559 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:33:04 np0005538960 nova_compute[187252]: 2025-11-28 16:33:04.690 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:33:06 np0005538960 podman[223042]: 2025-11-28 16:33:06.182498441 +0000 UTC m=+0.087931410 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, 
org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 28 11:33:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:33:06.352 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:33:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:33:06.353 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:33:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:33:06.353 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:33:08 np0005538960 nova_compute[187252]: 2025-11-28 16:33:08.589 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:33:09 np0005538960 nova_compute[187252]: 2025-11-28 16:33:09.693 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:33:11 np0005538960 podman[223062]: 2025-11-28 16:33:11.177611684 +0000 UTC m=+0.085044861 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:33:13 np0005538960 nova_compute[187252]: 2025-11-28 16:33:13.591 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:33:14 np0005538960 nova_compute[187252]: 2025-11-28 16:33:14.696 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:33:14 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:33:14.726 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Nov 28 11:33:14 np0005538960 nova_compute[187252]: 2025-11-28 16:33:14.727 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:33:14 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:33:14.728 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Nov 28 11:33:18 np0005538960 podman[223087]: 2025-11-28 16:33:18.193512093 +0000 UTC m=+0.092120711 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Nov 28 11:33:18 np0005538960 nova_compute[187252]: 2025-11-28 16:33:18.593 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:33:19 np0005538960 nova_compute[187252]: 2025-11-28 16:33:19.699 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:33:21 np0005538960 podman[223113]: 2025-11-28 16:33:21.152941602 +0000 UTC m=+0.058562992 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 11:33:21 np0005538960 podman[223114]: 2025-11-28 16:33:21.164801417 +0000 UTC m=+0.063023199 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:33:23 np0005538960 nova_compute[187252]: 2025-11-28 16:33:23.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:33:23 np0005538960 nova_compute[187252]: 2025-11-28 16:33:23.595 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:24 np0005538960 nova_compute[187252]: 2025-11-28 16:33:24.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:33:24 np0005538960 nova_compute[187252]: 2025-11-28 16:33:24.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:33:24 np0005538960 nova_compute[187252]: 2025-11-28 16:33:24.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:33:24 np0005538960 nova_compute[187252]: 2025-11-28 16:33:24.702 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:24 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:33:24.730 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:33:26 np0005538960 podman[223153]: 2025-11-28 16:33:26.148130795 +0000 UTC m=+0.056849610 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:33:27 np0005538960 nova_compute[187252]: 2025-11-28 16:33:27.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:33:27 np0005538960 nova_compute[187252]: 2025-11-28 16:33:27.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:33:27 np0005538960 nova_compute[187252]: 2025-11-28 16:33:27.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:33:27 np0005538960 nova_compute[187252]: 2025-11-28 16:33:27.332 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:33:28 np0005538960 nova_compute[187252]: 2025-11-28 16:33:28.597 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:29 np0005538960 nova_compute[187252]: 2025-11-28 16:33:29.704 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:30 np0005538960 podman[223177]: 2025-11-28 16:33:30.155241365 +0000 UTC m=+0.060855066 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, vcs-type=git, name=ubi9-minimal, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.370 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.370 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.370 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.371 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.544 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.545 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5735MB free_disk=73.33792114257812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.545 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.546 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.609 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.610 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.651 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.668 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.670 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:33:30 np0005538960 nova_compute[187252]: 2025-11-28 16:33:30.670 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.124s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:33:31 np0005538960 nova_compute[187252]: 2025-11-28 16:33:31.670 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:33:33 np0005538960 nova_compute[187252]: 2025-11-28 16:33:33.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:33:33 np0005538960 nova_compute[187252]: 2025-11-28 16:33:33.598 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:34 np0005538960 nova_compute[187252]: 2025-11-28 16:33:34.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:33:34 np0005538960 nova_compute[187252]: 2025-11-28 16:33:34.708 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:37 np0005538960 podman[223198]: 2025-11-28 16:33:37.179949315 +0000 UTC m=+0.076467573 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Nov 28 11:33:38 np0005538960 nova_compute[187252]: 2025-11-28 16:33:38.647 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:39 np0005538960 nova_compute[187252]: 2025-11-28 16:33:39.710 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:42 np0005538960 podman[223218]: 2025-11-28 16:33:42.146855839 +0000 UTC m=+0.056515344 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:33:43 np0005538960 nova_compute[187252]: 2025-11-28 16:33:43.649 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:44 np0005538960 nova_compute[187252]: 2025-11-28 16:33:44.713 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:46 np0005538960 ovn_controller[95460]: 2025-11-28T16:33:46Z|00216|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Nov 28 11:33:48 np0005538960 nova_compute[187252]: 2025-11-28 16:33:48.654 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:49 np0005538960 podman[223241]: 2025-11-28 16:33:49.217970546 +0000 UTC m=+0.113139238 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:33:49 np0005538960 nova_compute[187252]: 2025-11-28 16:33:49.720 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:52 np0005538960 podman[223268]: 2025-11-28 16:33:52.155983697 +0000 UTC m=+0.057213309 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, 
maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 28 11:33:52 np0005538960 podman[223267]: 2025-11-28 16:33:52.169724659 +0000 UTC m=+0.069343462 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251125)
Nov 28 11:33:53 np0005538960 nova_compute[187252]: 2025-11-28 16:33:53.656 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:54 np0005538960 nova_compute[187252]: 2025-11-28 16:33:54.725 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:57 np0005538960 podman[223309]: 2025-11-28 16:33:57.182062457 +0000 UTC m=+0.081388942 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:33:58 np0005538960 nova_compute[187252]: 2025-11-28 16:33:58.658 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:33:59 np0005538960 nova_compute[187252]: 2025-11-28 16:33:59.728 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:00 np0005538960 nova_compute[187252]: 2025-11-28 16:34:00.378 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:00 np0005538960 nova_compute[187252]: 2025-11-28 16:34:00.378 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:00 np0005538960 nova_compute[187252]: 2025-11-28 16:34:00.421 187256 DEBUG nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:34:00 np0005538960 nova_compute[187252]: 2025-11-28 16:34:00.918 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:00 np0005538960 nova_compute[187252]: 2025-11-28 16:34:00.918 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:00 np0005538960 nova_compute[187252]: 2025-11-28 16:34:00.932 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:34:00 np0005538960 nova_compute[187252]: 2025-11-28 16:34:00.933 187256 INFO nova.compute.claims [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:34:01 np0005538960 nova_compute[187252]: 2025-11-28 16:34:01.108 187256 DEBUG nova.compute.provider_tree [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:34:01 np0005538960 nova_compute[187252]: 2025-11-28 16:34:01.135 187256 DEBUG nova.scheduler.client.report [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:34:01 np0005538960 podman[223334]: 2025-11-28 16:34:01.171148512 +0000 UTC m=+0.075276135 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, com.redhat.component=ubi9-minimal-container, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc.)
Nov 28 11:34:01 np0005538960 nova_compute[187252]: 2025-11-28 16:34:01.174 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.256s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:01 np0005538960 nova_compute[187252]: 2025-11-28 16:34:01.175 187256 DEBUG nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:34:01 np0005538960 nova_compute[187252]: 2025-11-28 16:34:01.422 187256 DEBUG nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:34:01 np0005538960 nova_compute[187252]: 2025-11-28 16:34:01.422 187256 DEBUG nova.network.neutron [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:34:01 np0005538960 nova_compute[187252]: 2025-11-28 16:34:01.649 187256 INFO nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.145 187256 DEBUG nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.327 187256 DEBUG nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.329 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.329 187256 INFO nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Creating image(s)#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.330 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "/var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.330 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "/var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.331 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "/var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.345 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.426 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.428 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.428 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.442 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.513 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.514 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.595 187256 DEBUG nova.policy [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.604 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk 1073741824" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.605 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.605 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.664 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.666 187256 DEBUG nova.virt.disk.api [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Checking if we can resize image /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.666 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.726 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.727 187256 DEBUG nova.virt.disk.api [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Cannot resize image /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.728 187256 DEBUG nova.objects.instance [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lazy-loading 'migration_context' on Instance uuid b890bd95-f884-4215-91f2-749834092bc1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.768 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.769 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Ensure instance console log exists: /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.769 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.770 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:02 np0005538960 nova_compute[187252]: 2025-11-28 16:34:02.770 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:03 np0005538960 nova_compute[187252]: 2025-11-28 16:34:03.661 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:04 np0005538960 nova_compute[187252]: 2025-11-28 16:34:04.731 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:05 np0005538960 nova_compute[187252]: 2025-11-28 16:34:05.414 187256 DEBUG nova.network.neutron [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Successfully created port: c2abe47d-718c-4e51-b661-823b9fa7add9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:34:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:06.353 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:06.354 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:06.354 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:07 np0005538960 nova_compute[187252]: 2025-11-28 16:34:07.239 187256 DEBUG nova.network.neutron [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Successfully created port: ab0766c8-c7bc-4af1-9cc0-971475b014b7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:34:08 np0005538960 podman[223370]: 2025-11-28 16:34:08.182169112 +0000 UTC m=+0.088379502 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3)
Nov 28 11:34:08 np0005538960 nova_compute[187252]: 2025-11-28 16:34:08.663 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:09 np0005538960 nova_compute[187252]: 2025-11-28 16:34:09.635 187256 DEBUG nova.network.neutron [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Successfully updated port: c2abe47d-718c-4e51-b661-823b9fa7add9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:34:09 np0005538960 nova_compute[187252]: 2025-11-28 16:34:09.734 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:09 np0005538960 nova_compute[187252]: 2025-11-28 16:34:09.803 187256 DEBUG nova.compute.manager [req-a7db2006-92f9-4d01-b997-d8ff5e4b6099 req-6acb4840-9bc5-4e8f-b00e-27f2fd03ee40 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-changed-c2abe47d-718c-4e51-b661-823b9fa7add9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:34:09 np0005538960 nova_compute[187252]: 2025-11-28 16:34:09.803 187256 DEBUG nova.compute.manager [req-a7db2006-92f9-4d01-b997-d8ff5e4b6099 req-6acb4840-9bc5-4e8f-b00e-27f2fd03ee40 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Refreshing instance network info cache due to event network-changed-c2abe47d-718c-4e51-b661-823b9fa7add9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:34:09 np0005538960 nova_compute[187252]: 2025-11-28 16:34:09.804 187256 DEBUG oslo_concurrency.lockutils [req-a7db2006-92f9-4d01-b997-d8ff5e4b6099 req-6acb4840-9bc5-4e8f-b00e-27f2fd03ee40 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:34:09 np0005538960 nova_compute[187252]: 2025-11-28 16:34:09.804 187256 DEBUG oslo_concurrency.lockutils [req-a7db2006-92f9-4d01-b997-d8ff5e4b6099 req-6acb4840-9bc5-4e8f-b00e-27f2fd03ee40 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:34:09 np0005538960 nova_compute[187252]: 2025-11-28 16:34:09.804 187256 DEBUG nova.network.neutron [req-a7db2006-92f9-4d01-b997-d8ff5e4b6099 req-6acb4840-9bc5-4e8f-b00e-27f2fd03ee40 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Refreshing network info cache for port c2abe47d-718c-4e51-b661-823b9fa7add9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:34:10 np0005538960 nova_compute[187252]: 2025-11-28 16:34:10.343 187256 DEBUG nova.network.neutron [req-a7db2006-92f9-4d01-b997-d8ff5e4b6099 req-6acb4840-9bc5-4e8f-b00e-27f2fd03ee40 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:34:10 np0005538960 nova_compute[187252]: 2025-11-28 16:34:10.785 187256 DEBUG nova.network.neutron [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Successfully updated port: ab0766c8-c7bc-4af1-9cc0-971475b014b7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:34:10 np0005538960 nova_compute[187252]: 2025-11-28 16:34:10.814 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:34:10 np0005538960 nova_compute[187252]: 2025-11-28 16:34:10.920 187256 DEBUG nova.network.neutron [req-a7db2006-92f9-4d01-b997-d8ff5e4b6099 req-6acb4840-9bc5-4e8f-b00e-27f2fd03ee40 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:34:10 np0005538960 nova_compute[187252]: 2025-11-28 16:34:10.953 187256 DEBUG oslo_concurrency.lockutils [req-a7db2006-92f9-4d01-b997-d8ff5e4b6099 req-6acb4840-9bc5-4e8f-b00e-27f2fd03ee40 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:34:10 np0005538960 nova_compute[187252]: 2025-11-28 16:34:10.954 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquired lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:34:10 np0005538960 nova_compute[187252]: 2025-11-28 16:34:10.955 187256 DEBUG nova.network.neutron [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:34:11 np0005538960 nova_compute[187252]: 2025-11-28 16:34:11.484 187256 DEBUG nova.network.neutron [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:34:12 np0005538960 nova_compute[187252]: 2025-11-28 16:34:12.231 187256 DEBUG nova.compute.manager [req-5613fd74-1aef-4fc8-b61c-2c9ac557e87d req-e6d65299-bcd1-437b-97a7-6dfdafeff998 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-changed-ab0766c8-c7bc-4af1-9cc0-971475b014b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:34:12 np0005538960 nova_compute[187252]: 2025-11-28 16:34:12.232 187256 DEBUG nova.compute.manager [req-5613fd74-1aef-4fc8-b61c-2c9ac557e87d req-e6d65299-bcd1-437b-97a7-6dfdafeff998 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Refreshing instance network info cache due to event network-changed-ab0766c8-c7bc-4af1-9cc0-971475b014b7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:34:12 np0005538960 nova_compute[187252]: 2025-11-28 16:34:12.232 187256 DEBUG oslo_concurrency.lockutils [req-5613fd74-1aef-4fc8-b61c-2c9ac557e87d req-e6d65299-bcd1-437b-97a7-6dfdafeff998 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:34:13 np0005538960 podman[223391]: 2025-11-28 16:34:13.149870683 +0000 UTC m=+0.055259963 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:34:13 np0005538960 nova_compute[187252]: 2025-11-28 16:34:13.665 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:14 np0005538960 nova_compute[187252]: 2025-11-28 16:34:14.738 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.604 187256 DEBUG nova.network.neutron [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updating instance_info_cache with network_info: [{"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.639 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Releasing lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.640 187256 DEBUG nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Instance network_info: |[{"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.641 187256 DEBUG oslo_concurrency.lockutils [req-5613fd74-1aef-4fc8-b61c-2c9ac557e87d req-e6d65299-bcd1-437b-97a7-6dfdafeff998 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.641 187256 DEBUG nova.network.neutron [req-5613fd74-1aef-4fc8-b61c-2c9ac557e87d req-e6d65299-bcd1-437b-97a7-6dfdafeff998 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Refreshing network info cache for port ab0766c8-c7bc-4af1-9cc0-971475b014b7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.644 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Start _get_guest_xml network_info=[{"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.649 187256 WARNING nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.658 187256 DEBUG nova.virt.libvirt.host [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.659 187256 DEBUG nova.virt.libvirt.host [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.669 187256 DEBUG nova.virt.libvirt.host [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.670 187256 DEBUG nova.virt.libvirt.host [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.671 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.672 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.672 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.673 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.674 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.674 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.675 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.675 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.675 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.675 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.676 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.676 187256 DEBUG nova.virt.hardware [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.680 187256 DEBUG nova.virt.libvirt.vif [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:33:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-894824609',display_name='tempest-TestGettingAddress-server-894824609',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-894824609',id=48,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFktOrX6gF2AzWhwO+nGvoS57OBSP3laYDHq62fCTtGv8c+DoaFIxSAOh6oCv+DiEK35kK0uU+oYfnbqBnIodHTIIADd7iRQsOEXhskcFfG472xjkhS/wB+Vvdgt7W/5jg==',key_name='tempest-TestGettingAddress-1253484732',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-eejrb9bi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:34:02Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=b890bd95-f884-4215-91f2-749834092bc1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.681 187256 DEBUG nova.network.os_vif_util [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.682 187256 DEBUG nova.network.os_vif_util [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:f4:04,bridge_name='br-int',has_traffic_filtering=True,id=c2abe47d-718c-4e51-b661-823b9fa7add9,network=Network(f167768e-3551-41c3-a3de-da1c1ed19a2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2abe47d-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.682 187256 DEBUG nova.virt.libvirt.vif [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:33:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-894824609',display_name='tempest-TestGettingAddress-server-894824609',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-894824609',id=48,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFktOrX6gF2AzWhwO+nGvoS57OBSP3laYDHq62fCTtGv8c+DoaFIxSAOh6oCv+DiEK35kK0uU+oYfnbqBnIodHTIIADd7iRQsOEXhskcFfG472xjkhS/wB+Vvdgt7W/5jg==',key_name='tempest-TestGettingAddress-1253484732',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-eejrb9bi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:34:02Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=b890bd95-f884-4215-91f2-749834092bc1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.683 187256 DEBUG nova.network.os_vif_util [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.683 187256 DEBUG nova.network.os_vif_util [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:d1:2c,bridge_name='br-int',has_traffic_filtering=True,id=ab0766c8-c7bc-4af1-9cc0-971475b014b7,network=Network(c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0766c8-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.684 187256 DEBUG nova.objects.instance [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lazy-loading 'pci_devices' on Instance uuid b890bd95-f884-4215-91f2-749834092bc1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.702 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <uuid>b890bd95-f884-4215-91f2-749834092bc1</uuid>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <name>instance-00000030</name>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestGettingAddress-server-894824609</nova:name>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:34:17</nova:creationTime>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        <nova:user uuid="23b8e0c173df4c2883fccd8cb472e427">tempest-TestGettingAddress-2054466537-project-member</nova:user>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        <nova:project uuid="b5f802fe6e0b4d62bba6143515207a40">tempest-TestGettingAddress-2054466537</nova:project>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        <nova:port uuid="c2abe47d-718c-4e51-b661-823b9fa7add9">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        <nova:port uuid="ab0766c8-c7bc-4af1-9cc0-971475b014b7">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe3f:d12c" ipVersion="6"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <entry name="serial">b890bd95-f884-4215-91f2-749834092bc1</entry>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <entry name="uuid">b890bd95-f884-4215-91f2-749834092bc1</entry>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk.config"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:24:f4:04"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <target dev="tapc2abe47d-71"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:3f:d1:2c"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <target dev="tapab0766c8-c7"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/console.log" append="off"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:34:17 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:34:17 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:34:17 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:34:17 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.704 187256 DEBUG nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Preparing to wait for external event network-vif-plugged-c2abe47d-718c-4e51-b661-823b9fa7add9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.705 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.705 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.705 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.705 187256 DEBUG nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Preparing to wait for external event network-vif-plugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.706 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.706 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.706 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.707 187256 DEBUG nova.virt.libvirt.vif [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:33:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-894824609',display_name='tempest-TestGettingAddress-server-894824609',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-894824609',id=48,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFktOrX6gF2AzWhwO+nGvoS57OBSP3laYDHq62fCTtGv8c+DoaFIxSAOh6oCv+DiEK35kK0uU+oYfnbqBnIodHTIIADd7iRQsOEXhskcFfG472xjkhS/wB+Vvdgt7W/5jg==',key_name='tempest-TestGettingAddress-1253484732',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-eejrb9bi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:34:02Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=b890bd95-f884-4215-91f2-749834092bc1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.707 187256 DEBUG nova.network.os_vif_util [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.708 187256 DEBUG nova.network.os_vif_util [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:f4:04,bridge_name='br-int',has_traffic_filtering=True,id=c2abe47d-718c-4e51-b661-823b9fa7add9,network=Network(f167768e-3551-41c3-a3de-da1c1ed19a2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2abe47d-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.708 187256 DEBUG os_vif [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:f4:04,bridge_name='br-int',has_traffic_filtering=True,id=c2abe47d-718c-4e51-b661-823b9fa7add9,network=Network(f167768e-3551-41c3-a3de-da1c1ed19a2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2abe47d-71') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.709 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.709 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.710 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.714 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.714 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2abe47d-71, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.714 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc2abe47d-71, col_values=(('external_ids', {'iface-id': 'c2abe47d-718c-4e51-b661-823b9fa7add9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:f4:04', 'vm-uuid': 'b890bd95-f884-4215-91f2-749834092bc1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.716 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:17 np0005538960 NetworkManager[55548]: <info>  [1764347657.7169] manager: (tapc2abe47d-71): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.718 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.725 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.726 187256 INFO os_vif [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:f4:04,bridge_name='br-int',has_traffic_filtering=True,id=c2abe47d-718c-4e51-b661-823b9fa7add9,network=Network(f167768e-3551-41c3-a3de-da1c1ed19a2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2abe47d-71')#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.727 187256 DEBUG nova.virt.libvirt.vif [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:33:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-894824609',display_name='tempest-TestGettingAddress-server-894824609',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-894824609',id=48,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFktOrX6gF2AzWhwO+nGvoS57OBSP3laYDHq62fCTtGv8c+DoaFIxSAOh6oCv+DiEK35kK0uU+oYfnbqBnIodHTIIADd7iRQsOEXhskcFfG472xjkhS/wB+Vvdgt7W/5jg==',key_name='tempest-TestGettingAddress-1253484732',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-eejrb9bi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:34:02Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=b890bd95-f884-4215-91f2-749834092bc1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.727 187256 DEBUG nova.network.os_vif_util [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.728 187256 DEBUG nova.network.os_vif_util [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3f:d1:2c,bridge_name='br-int',has_traffic_filtering=True,id=ab0766c8-c7bc-4af1-9cc0-971475b014b7,network=Network(c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0766c8-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.729 187256 DEBUG os_vif [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:d1:2c,bridge_name='br-int',has_traffic_filtering=True,id=ab0766c8-c7bc-4af1-9cc0-971475b014b7,network=Network(c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0766c8-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.729 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.729 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.730 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.732 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.732 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab0766c8-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.733 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab0766c8-c7, col_values=(('external_ids', {'iface-id': 'ab0766c8-c7bc-4af1-9cc0-971475b014b7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3f:d1:2c', 'vm-uuid': 'b890bd95-f884-4215-91f2-749834092bc1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.734 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:17 np0005538960 NetworkManager[55548]: <info>  [1764347657.7349] manager: (tapab0766c8-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.736 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.741 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.742 187256 INFO os_vif [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3f:d1:2c,bridge_name='br-int',has_traffic_filtering=True,id=ab0766c8-c7bc-4af1-9cc0-971475b014b7,network=Network(c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0766c8-c7')#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.799 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.800 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.800 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No VIF found with MAC fa:16:3e:24:f4:04, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.800 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No VIF found with MAC fa:16:3e:3f:d1:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:34:17 np0005538960 nova_compute[187252]: 2025-11-28 16:34:17.801 187256 INFO nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Using config drive#033[00m
Nov 28 11:34:18 np0005538960 nova_compute[187252]: 2025-11-28 16:34:18.379 187256 INFO nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Creating config drive at /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk.config#033[00m
Nov 28 11:34:18 np0005538960 nova_compute[187252]: 2025-11-28 16:34:18.385 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp63orzjnz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:34:18 np0005538960 nova_compute[187252]: 2025-11-28 16:34:18.511 187256 DEBUG oslo_concurrency.processutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp63orzjnz" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:34:18 np0005538960 NetworkManager[55548]: <info>  [1764347658.5816] manager: (tapc2abe47d-71): new Tun device (/org/freedesktop/NetworkManager/Devices/116)
Nov 28 11:34:18 np0005538960 kernel: tapc2abe47d-71: entered promiscuous mode
Nov 28 11:34:18 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:18Z|00217|binding|INFO|Claiming lport c2abe47d-718c-4e51-b661-823b9fa7add9 for this chassis.
Nov 28 11:34:18 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:18Z|00218|binding|INFO|c2abe47d-718c-4e51-b661-823b9fa7add9: Claiming fa:16:3e:24:f4:04 10.100.0.12
Nov 28 11:34:18 np0005538960 nova_compute[187252]: 2025-11-28 16:34:18.584 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:18 np0005538960 NetworkManager[55548]: <info>  [1764347658.6006] manager: (tapab0766c8-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/117)
Nov 28 11:34:18 np0005538960 nova_compute[187252]: 2025-11-28 16:34:18.601 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:18 np0005538960 kernel: tapab0766c8-c7: entered promiscuous mode
Nov 28 11:34:18 np0005538960 systemd-udevd[223440]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:34:18 np0005538960 systemd-udevd[223441]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.627 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:f4:04 10.100.0.12'], port_security=['fa:16:3e:24:f4:04 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f167768e-3551-41c3-a3de-da1c1ed19a2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'efa51696-1cad-4945-b794-623719fa4d3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ea0ae4f-79c3-4737-b920-c386035d0846, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=c2abe47d-718c-4e51-b661-823b9fa7add9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.629 104369 INFO neutron.agent.ovn.metadata.agent [-] Port c2abe47d-718c-4e51-b661-823b9fa7add9 in datapath f167768e-3551-41c3-a3de-da1c1ed19a2c bound to our chassis#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.631 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f167768e-3551-41c3-a3de-da1c1ed19a2c#033[00m
Nov 28 11:34:18 np0005538960 NetworkManager[55548]: <info>  [1764347658.6338] device (tapc2abe47d-71): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:34:18 np0005538960 NetworkManager[55548]: <info>  [1764347658.6350] device (tapc2abe47d-71): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:34:18 np0005538960 NetworkManager[55548]: <info>  [1764347658.6383] device (tapab0766c8-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:34:18 np0005538960 NetworkManager[55548]: <info>  [1764347658.6394] device (tapab0766c8-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.642 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[264197cd-3901-4702-9244-bc95546b7397]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.644 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf167768e-31 in ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.647 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf167768e-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.647 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[361be19c-094a-414d-9e2d-00ce686269b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.648 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ae16eb1c-8326-4d24-b75b-39a135896e35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 systemd-machined[153518]: New machine qemu-18-instance-00000030.
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.659 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[ab91e5b0-e840-45ac-9803-2910e7f984b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:18Z|00219|binding|INFO|Claiming lport ab0766c8-c7bc-4af1-9cc0-971475b014b7 for this chassis.
Nov 28 11:34:18 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:18Z|00220|binding|INFO|ab0766c8-c7bc-4af1-9cc0-971475b014b7: Claiming fa:16:3e:3f:d1:2c 2001:db8::f816:3eff:fe3f:d12c
Nov 28 11:34:18 np0005538960 nova_compute[187252]: 2025-11-28 16:34:18.676 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:18 np0005538960 nova_compute[187252]: 2025-11-28 16:34:18.681 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:18 np0005538960 systemd[1]: Started Virtual Machine qemu-18-instance-00000030.
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.688 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[53b700ca-6bd9-4c74-962c-e2005638475b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:18Z|00221|binding|INFO|Setting lport c2abe47d-718c-4e51-b661-823b9fa7add9 ovn-installed in OVS
Nov 28 11:34:18 np0005538960 nova_compute[187252]: 2025-11-28 16:34:18.691 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:18 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:18Z|00222|binding|INFO|Setting lport c2abe47d-718c-4e51-b661-823b9fa7add9 up in Southbound
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.696 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:d1:2c 2001:db8::f816:3eff:fe3f:d12c'], port_security=['fa:16:3e:3f:d1:2c 2001:db8::f816:3eff:fe3f:d12c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:d12c/64', 'neutron:device_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'efa51696-1cad-4945-b794-623719fa4d3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30aaa9d3-aada-4fac-a64f-fec159b96017, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=ab0766c8-c7bc-4af1-9cc0-971475b014b7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:34:18 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:18Z|00223|binding|INFO|Setting lport ab0766c8-c7bc-4af1-9cc0-971475b014b7 ovn-installed in OVS
Nov 28 11:34:18 np0005538960 nova_compute[187252]: 2025-11-28 16:34:18.701 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.717 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[aa4181f5-1c42-48ca-bda2-9c51e557a524]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:18Z|00224|binding|INFO|Setting lport ab0766c8-c7bc-4af1-9cc0-971475b014b7 up in Southbound
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.728 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[8c44a39d-4c4b-4f51-89eb-a9c9755fe6c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 NetworkManager[55548]: <info>  [1764347658.7293] manager: (tapf167768e-30): new Veth device (/org/freedesktop/NetworkManager/Devices/118)
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.763 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[f88b526a-8aad-4d12-aa28-6aa35577f50b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.766 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[77220b07-1769-45d0-b364-d8ebce3ee898]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 NetworkManager[55548]: <info>  [1764347658.7876] device (tapf167768e-30): carrier: link connected
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.791 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[0f637290-16df-4c05-8c7f-fe8583655c85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.809 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d71235f5-a24e-4a2e-b5e9-d752218bd053]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf167768e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:e8:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475836, 'reachable_time': 39073, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223480, 'error': None, 'target': 'ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.826 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[26e153b3-b2a5-49e7-8a48-251556f3b1d6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb6:e80f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475836, 'tstamp': 475836}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223481, 'error': None, 'target': 'ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.845 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[05144239-f160-436d-9d16-9e9b37cba2b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf167768e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:e8:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 68], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475836, 'reachable_time': 39073, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223482, 'error': None, 'target': 'ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.884 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a7503507-f02a-4af4-afc5-2efbbaa7ceb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.958 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2c35e812-0ea2-49b2-9e76-ad731d6a6b3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.960 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf167768e-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.961 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.961 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf167768e-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:18 np0005538960 NetworkManager[55548]: <info>  [1764347658.9647] manager: (tapf167768e-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Nov 28 11:34:18 np0005538960 kernel: tapf167768e-30: entered promiscuous mode
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.966 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf167768e-30, col_values=(('external_ids', {'iface-id': '2c360ecb-6340-4b7d-b161-50246b8a26c6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:18 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:18Z|00225|binding|INFO|Releasing lport 2c360ecb-6340-4b7d-b161-50246b8a26c6 from this chassis (sb_readonly=0)
Nov 28 11:34:18 np0005538960 nova_compute[187252]: 2025-11-28 16:34:18.986 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.990 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f167768e-3551-41c3-a3de-da1c1ed19a2c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f167768e-3551-41c3-a3de-da1c1ed19a2c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.991 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[29bc1067-f066-4639-b1b3-ac1795bc0173]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.992 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-f167768e-3551-41c3-a3de-da1c1ed19a2c
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/f167768e-3551-41c3-a3de-da1c1ed19a2c.pid.haproxy
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID f167768e-3551-41c3-a3de-da1c1ed19a2c
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:34:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:18.992 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c', 'env', 'PROCESS_TAG=haproxy-f167768e-3551-41c3-a3de-da1c1ed19a2c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f167768e-3551-41c3-a3de-da1c1ed19a2c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.122 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347659.1215177, b890bd95-f884-4215-91f2-749834092bc1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.122 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] VM Started (Lifecycle Event)#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.152 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.164 187256 DEBUG nova.compute.manager [req-4d7dfadd-025e-44ed-9a38-ba81e2afd596 req-48e8fb60-ed3d-4bc6-b98f-76c177f0ad89 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-plugged-c2abe47d-718c-4e51-b661-823b9fa7add9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.164 187256 DEBUG oslo_concurrency.lockutils [req-4d7dfadd-025e-44ed-9a38-ba81e2afd596 req-48e8fb60-ed3d-4bc6-b98f-76c177f0ad89 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.165 187256 DEBUG oslo_concurrency.lockutils [req-4d7dfadd-025e-44ed-9a38-ba81e2afd596 req-48e8fb60-ed3d-4bc6-b98f-76c177f0ad89 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.165 187256 DEBUG oslo_concurrency.lockutils [req-4d7dfadd-025e-44ed-9a38-ba81e2afd596 req-48e8fb60-ed3d-4bc6-b98f-76c177f0ad89 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.165 187256 DEBUG nova.compute.manager [req-4d7dfadd-025e-44ed-9a38-ba81e2afd596 req-48e8fb60-ed3d-4bc6-b98f-76c177f0ad89 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Processing event network-vif-plugged-c2abe47d-718c-4e51-b661-823b9fa7add9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.168 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347659.1248486, b890bd95-f884-4215-91f2-749834092bc1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.169 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.202 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.208 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.235 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:34:19 np0005538960 podman[223522]: 2025-11-28 16:34:19.38078291 +0000 UTC m=+0.026266074 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:34:19 np0005538960 podman[223522]: 2025-11-28 16:34:19.648525373 +0000 UTC m=+0.294008517 container create 19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 11:34:19 np0005538960 systemd[1]: Started libpod-conmon-19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621.scope.
Nov 28 11:34:19 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:34:19 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fc1b0a6ecd058275e21f89c7b965aef03667e29e48522afc9f914e9830427cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:34:19 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:19.771 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:34:19 np0005538960 nova_compute[187252]: 2025-11-28 16:34:19.772 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:19 np0005538960 podman[223522]: 2025-11-28 16:34:19.776031216 +0000 UTC m=+0.421514420 container init 19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 11:34:19 np0005538960 podman[223522]: 2025-11-28 16:34:19.783011954 +0000 UTC m=+0.428495098 container start 19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Nov 28 11:34:19 np0005538960 neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c[223550]: [NOTICE]   (223567) : New worker (223569) forked
Nov 28 11:34:19 np0005538960 neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c[223550]: [NOTICE]   (223567) : Loading success.
Nov 28 11:34:19 np0005538960 podman[223535]: 2025-11-28 16:34:19.901272525 +0000 UTC m=+0.219042791 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 11:34:19 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:19.947 104369 INFO neutron.agent.ovn.metadata.agent [-] Port ab0766c8-c7bc-4af1-9cc0-971475b014b7 in datapath c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da unbound from our chassis#033[00m
Nov 28 11:34:19 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:19.949 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da#033[00m
Nov 28 11:34:19 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:19.964 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[2051ed5b-0efe-4d29-9096-17f356ce60a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:19 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:19.965 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc053bf8e-51 in ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:34:19 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:19.968 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc053bf8e-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:34:19 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:19.968 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e39a523f-538d-4831-914b-f698c1b5a368]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:19 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:19.969 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[750ebd36-dc33-421e-8560-eb666cbd6a73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:19 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:19.981 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4e259d-1b7c-4301-9229-80e29b1c61fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:19 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:19.996 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[af01b17f-09f0-4c40-89a7-c7ebfebba22a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.032 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[5abb1e50-2153-4397-814e-de7f58b4a0d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.036 187256 DEBUG nova.compute.manager [req-21dcfa37-28ed-4b2f-8840-80026cbdf729 req-2d9f62d2-f14f-461c-944c-d3e8bdb2c47c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-plugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.036 187256 DEBUG oslo_concurrency.lockutils [req-21dcfa37-28ed-4b2f-8840-80026cbdf729 req-2d9f62d2-f14f-461c-944c-d3e8bdb2c47c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.037 187256 DEBUG oslo_concurrency.lockutils [req-21dcfa37-28ed-4b2f-8840-80026cbdf729 req-2d9f62d2-f14f-461c-944c-d3e8bdb2c47c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.037 187256 DEBUG oslo_concurrency.lockutils [req-21dcfa37-28ed-4b2f-8840-80026cbdf729 req-2d9f62d2-f14f-461c-944c-d3e8bdb2c47c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.037 187256 DEBUG nova.compute.manager [req-21dcfa37-28ed-4b2f-8840-80026cbdf729 req-2d9f62d2-f14f-461c-944c-d3e8bdb2c47c 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Processing event network-vif-plugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.038 187256 DEBUG nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.039 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[f05ae503-b92e-418d-827a-ff51cb11a00d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 NetworkManager[55548]: <info>  [1764347660.0406] manager: (tapc053bf8e-50): new Veth device (/org/freedesktop/NetworkManager/Devices/120)
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.048 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347660.047862, b890bd95-f884-4215-91f2-749834092bc1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.049 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.051 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.060 187256 INFO nova.virt.libvirt.driver [-] [instance: b890bd95-f884-4215-91f2-749834092bc1] Instance spawned successfully.#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.061 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.077 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[592ac2de-96e7-4205-b9e5-74d89e7aff6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.080 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[1594af41-4920-4bbc-a30a-b3486bc8a013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.090 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.096 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.100 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.100 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.101 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.101 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.102 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.102 187256 DEBUG nova.virt.libvirt.driver [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:34:20 np0005538960 NetworkManager[55548]: <info>  [1764347660.1119] device (tapc053bf8e-50): carrier: link connected
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.118 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[a93b44e6-49ca-4375-94c0-e2832b36326e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.135 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.138 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[13ad6ed0-952b-4740-ac06-4b972d998aef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc053bf8e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:cf:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475969, 'reachable_time': 21482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223589, 'error': None, 'target': 'ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.155 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[eb1ad799-c17e-43e3-8e93-2389e1ec3a46]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe70:cf29'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 475969, 'tstamp': 475969}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223590, 'error': None, 'target': 'ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.174 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9e11cd4b-bc6b-4063-b510-e8ea14d243e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc053bf8e-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:70:cf:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475969, 'reachable_time': 21482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223591, 'error': None, 'target': 'ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.202 187256 INFO nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Took 17.87 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.203 187256 DEBUG nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.205 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[d0835416-9249-448d-bcbc-102f34da40e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.237 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[064c362c-6f83-45b4-8550-e3250e61c362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.239 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc053bf8e-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.239 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.240 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc053bf8e-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.242 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:20 np0005538960 NetworkManager[55548]: <info>  [1764347660.2428] manager: (tapc053bf8e-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Nov 28 11:34:20 np0005538960 kernel: tapc053bf8e-50: entered promiscuous mode
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.245 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.247 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc053bf8e-50, col_values=(('external_ids', {'iface-id': '4251436a-ae80-429d-9ed3-db2fe2ff59d6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.248 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:20 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:20Z|00226|binding|INFO|Releasing lport 4251436a-ae80-429d-9ed3-db2fe2ff59d6 from this chassis (sb_readonly=0)
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.251 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.252 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[68ded1a4-8dfc-4ec4-80ff-ece4a6ac31f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.253 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da.pid.haproxy
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:34:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:20.254 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da', 'env', 'PROCESS_TAG=haproxy-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.261 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.386 187256 INFO nova.compute.manager [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Took 19.88 seconds to build instance.#033[00m
Nov 28 11:34:20 np0005538960 nova_compute[187252]: 2025-11-28 16:34:20.431 187256 DEBUG oslo_concurrency.lockutils [None req-9073e833-f3df-4f52-9034-d683187cf5d7 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:20 np0005538960 podman[223622]: 2025-11-28 16:34:20.608770227 +0000 UTC m=+0.024130012 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:34:20 np0005538960 podman[223622]: 2025-11-28 16:34:20.743256679 +0000 UTC m=+0.158616434 container create 9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Nov 28 11:34:20 np0005538960 systemd[1]: Started libpod-conmon-9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9.scope.
Nov 28 11:34:20 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:34:20 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6b2b0cb4615a1e6653117f75d19d2f2e9a7dcd96534e9bfd64744e613eb081b0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:34:20 np0005538960 podman[223622]: 2025-11-28 16:34:20.851518307 +0000 UTC m=+0.266878112 container init 9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:34:20 np0005538960 podman[223622]: 2025-11-28 16:34:20.858968628 +0000 UTC m=+0.274328403 container start 9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS)
Nov 28 11:34:20 np0005538960 neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da[223637]: [NOTICE]   (223641) : New worker (223643) forked
Nov 28 11:34:20 np0005538960 neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da[223637]: [NOTICE]   (223641) : Loading success.
Nov 28 11:34:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:21.118 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:34:21 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:34:21.119 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:34:21 np0005538960 nova_compute[187252]: 2025-11-28 16:34:21.169 187256 DEBUG nova.network.neutron [req-5613fd74-1aef-4fc8-b61c-2c9ac557e87d req-e6d65299-bcd1-437b-97a7-6dfdafeff998 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updated VIF entry in instance network info cache for port ab0766c8-c7bc-4af1-9cc0-971475b014b7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:34:21 np0005538960 nova_compute[187252]: 2025-11-28 16:34:21.171 187256 DEBUG nova.network.neutron [req-5613fd74-1aef-4fc8-b61c-2c9ac557e87d req-e6d65299-bcd1-437b-97a7-6dfdafeff998 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updating instance_info_cache with network_info: [{"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:34:21 np0005538960 nova_compute[187252]: 2025-11-28 16:34:21.395 187256 DEBUG oslo_concurrency.lockutils [req-5613fd74-1aef-4fc8-b61c-2c9ac557e87d req-e6d65299-bcd1-437b-97a7-6dfdafeff998 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:34:21 np0005538960 nova_compute[187252]: 2025-11-28 16:34:21.480 187256 DEBUG nova.compute.manager [req-d6b98217-4683-4037-8c27-18e12ecbf381 req-b1ad9e82-aa20-4605-be37-c49b5ba9b512 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-plugged-c2abe47d-718c-4e51-b661-823b9fa7add9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:34:21 np0005538960 nova_compute[187252]: 2025-11-28 16:34:21.481 187256 DEBUG oslo_concurrency.lockutils [req-d6b98217-4683-4037-8c27-18e12ecbf381 req-b1ad9e82-aa20-4605-be37-c49b5ba9b512 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:21 np0005538960 nova_compute[187252]: 2025-11-28 16:34:21.481 187256 DEBUG oslo_concurrency.lockutils [req-d6b98217-4683-4037-8c27-18e12ecbf381 req-b1ad9e82-aa20-4605-be37-c49b5ba9b512 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:21 np0005538960 nova_compute[187252]: 2025-11-28 16:34:21.482 187256 DEBUG oslo_concurrency.lockutils [req-d6b98217-4683-4037-8c27-18e12ecbf381 req-b1ad9e82-aa20-4605-be37-c49b5ba9b512 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:21 np0005538960 nova_compute[187252]: 2025-11-28 16:34:21.482 187256 DEBUG nova.compute.manager [req-d6b98217-4683-4037-8c27-18e12ecbf381 req-b1ad9e82-aa20-4605-be37-c49b5ba9b512 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] No waiting events found dispatching network-vif-plugged-c2abe47d-718c-4e51-b661-823b9fa7add9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:34:21 np0005538960 nova_compute[187252]: 2025-11-28 16:34:21.482 187256 WARNING nova.compute.manager [req-d6b98217-4683-4037-8c27-18e12ecbf381 req-b1ad9e82-aa20-4605-be37-c49b5ba9b512 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received unexpected event network-vif-plugged-c2abe47d-718c-4e51-b661-823b9fa7add9 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:34:22 np0005538960 nova_compute[187252]: 2025-11-28 16:34:22.156 187256 DEBUG nova.compute.manager [req-e688df71-9da3-4930-a45e-da3653ca8e1c req-dcf0a75a-43b9-4f3a-b9da-39bec1fb2d87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-plugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:34:22 np0005538960 nova_compute[187252]: 2025-11-28 16:34:22.157 187256 DEBUG oslo_concurrency.lockutils [req-e688df71-9da3-4930-a45e-da3653ca8e1c req-dcf0a75a-43b9-4f3a-b9da-39bec1fb2d87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:22 np0005538960 nova_compute[187252]: 2025-11-28 16:34:22.157 187256 DEBUG oslo_concurrency.lockutils [req-e688df71-9da3-4930-a45e-da3653ca8e1c req-dcf0a75a-43b9-4f3a-b9da-39bec1fb2d87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:22 np0005538960 nova_compute[187252]: 2025-11-28 16:34:22.157 187256 DEBUG oslo_concurrency.lockutils [req-e688df71-9da3-4930-a45e-da3653ca8e1c req-dcf0a75a-43b9-4f3a-b9da-39bec1fb2d87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:22 np0005538960 nova_compute[187252]: 2025-11-28 16:34:22.158 187256 DEBUG nova.compute.manager [req-e688df71-9da3-4930-a45e-da3653ca8e1c req-dcf0a75a-43b9-4f3a-b9da-39bec1fb2d87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] No waiting events found dispatching network-vif-plugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:34:22 np0005538960 nova_compute[187252]: 2025-11-28 16:34:22.158 187256 WARNING nova.compute.manager [req-e688df71-9da3-4930-a45e-da3653ca8e1c req-dcf0a75a-43b9-4f3a-b9da-39bec1fb2d87 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received unexpected event network-vif-plugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 for instance with vm_state active and task_state None.#033[00m
Nov 28 11:34:22 np0005538960 podman[223655]: 2025-11-28 16:34:22.644906772 +0000 UTC m=+0.053558522 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:34:22 np0005538960 podman[223654]: 2025-11-28 16:34:22.663965392 +0000 UTC m=+0.077633113 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:34:22 np0005538960 nova_compute[187252]: 2025-11-28 16:34:22.734 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:23 np0005538960 nova_compute[187252]: 2025-11-28 16:34:23.689 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:25 np0005538960 nova_compute[187252]: 2025-11-28 16:34:25.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:34:25 np0005538960 nova_compute[187252]: 2025-11-28 16:34:25.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:34:26 np0005538960 nova_compute[187252]: 2025-11-28 16:34:26.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:34:26 np0005538960 nova_compute[187252]: 2025-11-28 16:34:26.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:34:27 np0005538960 nova_compute[187252]: 2025-11-28 16:34:27.736 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:28 np0005538960 podman[223690]: 2025-11-28 16:34:28.151371469 +0000 UTC m=+0.049788811 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:34:28 np0005538960 nova_compute[187252]: 2025-11-28 16:34:28.692 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:29 np0005538960 nova_compute[187252]: 2025-11-28 16:34:29.318 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:34:29 np0005538960 nova_compute[187252]: 2025-11-28 16:34:29.318 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:34:29 np0005538960 nova_compute[187252]: 2025-11-28 16:34:29.318 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:34:29 np0005538960 nova_compute[187252]: 2025-11-28 16:34:29.680 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:29 np0005538960 NetworkManager[55548]: <info>  [1764347669.6807] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Nov 28 11:34:29 np0005538960 NetworkManager[55548]: <info>  [1764347669.6814] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Nov 28 11:34:29 np0005538960 nova_compute[187252]: 2025-11-28 16:34:29.700 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:34:29 np0005538960 nova_compute[187252]: 2025-11-28 16:34:29.701 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:34:29 np0005538960 nova_compute[187252]: 2025-11-28 16:34:29.701 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:34:29 np0005538960 nova_compute[187252]: 2025-11-28 16:34:29.702 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b890bd95-f884-4215-91f2-749834092bc1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:34:29 np0005538960 nova_compute[187252]: 2025-11-28 16:34:29.805 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:29 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:29Z|00227|binding|INFO|Releasing lport 2c360ecb-6340-4b7d-b161-50246b8a26c6 from this chassis (sb_readonly=0)
Nov 28 11:34:29 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:29Z|00228|binding|INFO|Releasing lport 4251436a-ae80-429d-9ed3-db2fe2ff59d6 from this chassis (sb_readonly=0)
Nov 28 11:34:29 np0005538960 nova_compute[187252]: 2025-11-28 16:34:29.826 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:31 np0005538960 nova_compute[187252]: 2025-11-28 16:34:31.654 187256 DEBUG nova.compute.manager [req-1eaee403-36c4-4858-809d-45817e943aef req-710f3a6d-058d-4097-8c0b-99632a8e6756 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-changed-c2abe47d-718c-4e51-b661-823b9fa7add9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:34:31 np0005538960 nova_compute[187252]: 2025-11-28 16:34:31.655 187256 DEBUG nova.compute.manager [req-1eaee403-36c4-4858-809d-45817e943aef req-710f3a6d-058d-4097-8c0b-99632a8e6756 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Refreshing instance network info cache due to event network-changed-c2abe47d-718c-4e51-b661-823b9fa7add9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:34:31 np0005538960 nova_compute[187252]: 2025-11-28 16:34:31.655 187256 DEBUG oslo_concurrency.lockutils [req-1eaee403-36c4-4858-809d-45817e943aef req-710f3a6d-058d-4097-8c0b-99632a8e6756 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:34:31 np0005538960 podman[223720]: 2025-11-28 16:34:31.878753567 +0000 UTC m=+0.062220271 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, vcs-type=git, distribution-scope=public, managed_by=edpm_ansible)
Nov 28 11:34:32 np0005538960 nova_compute[187252]: 2025-11-28 16:34:32.739 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:33Z|00229|binding|INFO|Releasing lport 2c360ecb-6340-4b7d-b161-50246b8a26c6 from this chassis (sb_readonly=0)
Nov 28 11:34:33 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:33Z|00230|binding|INFO|Releasing lport 4251436a-ae80-429d-9ed3-db2fe2ff59d6 from this chassis (sb_readonly=0)
Nov 28 11:34:33 np0005538960 nova_compute[187252]: 2025-11-28 16:34:33.064 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:33 np0005538960 nova_compute[187252]: 2025-11-28 16:34:33.694 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.316 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b890bd95-f884-4215-91f2-749834092bc1', 'name': 'tempest-TestGettingAddress-server-894824609', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000030', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b5f802fe6e0b4d62bba6143515207a40', 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'hostId': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.318 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.318 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.319 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-894824609>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-894824609>]
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.319 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.337 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/cpu volume: 12120000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '48dc8f38-e569-4590-9646-00bef0044dca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 12120000000, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'timestamp': '2025-11-28T16:34:35.319800', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '1ffe579e-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.971909586, 'message_signature': '51ab73eb728bb1f3945320a99e66eba5a9502070cc2754ec3460d87cb6e2f24b'}]}, 'timestamp': '2025-11-28 16:34:35.339222', '_unique_id': 'f0ae332d7b214325a7ea78fe6701017c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.340 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.342 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.344 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b890bd95-f884-4215-91f2-749834092bc1 / tapc2abe47d-71 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.345 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b890bd95-f884-4215-91f2-749834092bc1 / tapab0766c8-c7 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.345 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.346 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2d5636fb-1f62-45a4-8c70-1c976753f76b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapc2abe47d-71', 'timestamp': '2025-11-28T16:34:35.342304', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapc2abe47d-71', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:f4:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2abe47d-71'}, 'message_id': '1fff6ddc-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '0f913a73dbb005627ccea1c7542f64ef4edefbf7a44f5386ec8c64d51c409fe8'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapab0766c8-c7', 'timestamp': '2025-11-28T16:34:35.342304', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapab0766c8-c7', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:d1:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab0766c8-c7'}, 'message_id': '1fff8218-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '0b15a526604b5b0af71b0e88007dfde799fb74b46e6761f4fbf443d1b72ac973'}]}, 'timestamp': '2025-11-28 16:34:35.346757', '_unique_id': '79fb8140e4cc4fc6862cada4d6b4de43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.347 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.348 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.379 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.read.bytes volume: 25349632 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.379 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.read.bytes volume: 55474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37c07f65-a070-4052-b3fa-6fa77477d7bd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 25349632, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-vda', 'timestamp': '2025-11-28T16:34:35.349025', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '200486fa-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': '07eb9c58b6acb389af1b51af4c8668970d3780730f25a20350de4cdc1733b599'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 55474, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 
'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-sda', 'timestamp': '2025-11-28T16:34:35.349025', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2004953c-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': '873601878c197eb4a25a73fe8ae41a330690818cd04b533aab95b77a91ac9cc2'}]}, 'timestamp': '2025-11-28 16:34:35.380014', '_unique_id': '70737a8858044312a86c3a5ec1e9f564'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.381 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.382 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.382 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.read.requests volume: 838 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.383 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.read.requests volume: 20 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c1b1524-2d8d-4d8c-892d-0c5232a94d1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 838, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-vda', 'timestamp': '2025-11-28T16:34:35.382632', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '20050aa8-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': 'b85b66673349bcc0290c3c5dffa33afda4b029fb2ca75faab1cdf3dab0c18664'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 20, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': 
None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-sda', 'timestamp': '2025-11-28T16:34:35.382632', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '200518cc-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': 'd7409471e16bbe4650702fad7d2ccb4a201db2ac4e044a86e20311f55c4fdd1a'}]}, 'timestamp': '2025-11-28 16:34:35.383358', '_unique_id': '450b7632c7ed4d2fb9837acb836b97f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.384 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.385 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.385 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.385 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1873be2e-766f-46cf-8472-c1f64595244a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapc2abe47d-71', 'timestamp': '2025-11-28T16:34:35.385346', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapc2abe47d-71', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:f4:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2abe47d-71'}, 'message_id': '200574de-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': 'c1c7328dcd965fe040646e0543cf85d0cceae2a6c8b3c822ee70d85d4487b8ab'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapab0766c8-c7', 'timestamp': '2025-11-28T16:34:35.385346', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapab0766c8-c7', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:d1:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab0766c8-c7'}, 'message_id': '20058384-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '968a4396f3620a19123b50ed2a639856cfe3448c969c3ffdd7939413dfba3a43'}]}, 'timestamp': '2025-11-28 16:34:35.386103', '_unique_id': '7868c7376c0a4c508d07fcdef9a499f5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.386 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.387 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.388 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.write.latency volume: 41215310591 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.388 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0b10a84-4764-4ee8-b6a2-c95244eb761e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 41215310591, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-vda', 'timestamp': '2025-11-28T16:34:35.388047', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2005dde8-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': '149ff5ced7000dba01592d45f2d77681995721a555bcff474038d9b8b5873bcc'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 
'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-sda', 'timestamp': '2025-11-28T16:34:35.388047', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2005ead6-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': 'ce4daf9f769b62fafa8aecccb4408eb9b3518dc7362161f87be258f02e40e779'}]}, 'timestamp': '2025-11-28 16:34:35.388735', '_unique_id': 'd7455f76868b45cfbb884f454954d173'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.389 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.390 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.390 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.391 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-894824609>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-894824609>]
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.391 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.391 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/memory.usage volume: 40.40625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dc06ce13-7f93-450a-879b-24bcc996f440', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.40625, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'timestamp': '2025-11-28T16:34:35.391471', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '20066646-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.971909586, 'message_signature': 'bb4f57e408ce728f5679dbd2d271046c6dd13b3a7e3ce63f33647f01c11d5a78'}]}, 'timestamp': '2025-11-28 16:34:35.391924', '_unique_id': '677d6cf3e2294765aef18efb21601efa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.392 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.393 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.393 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.394 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b5d5fd1-9b30-46bd-ad8b-8516d1125dbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapc2abe47d-71', 'timestamp': '2025-11-28T16:34:35.393949', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapc2abe47d-71', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:f4:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2abe47d-71'}, 'message_id': '2006c4e2-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '1c9126dd1988d0161f127c8868fccb9a568b60fd53a85e2261c392216c66eefe'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapab0766c8-c7', 'timestamp': '2025-11-28T16:34:35.393949', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapab0766c8-c7', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:d1:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab0766c8-c7'}, 'message_id': '2006d266-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '5e57159b3dc1cf7b526aec8a470713863d0dc1741846653ea6a751b8be8b2fbb'}]}, 'timestamp': '2025-11-28 16:34:35.394674', '_unique_id': '088bfdbbd4f04004ae53695ba4275910'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.395 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.396 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.396 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.396 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-894824609>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-894824609>]
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.397 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.397 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.397 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6fd9dd1-7fa6-4da4-afc4-43ca58685892', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapc2abe47d-71', 'timestamp': '2025-11-28T16:34:35.397222', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapc2abe47d-71', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:f4:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2abe47d-71'}, 'message_id': '2007446c-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '98253c004bb66407acdab0db9b6370e71d36887232837a0008a184cf1eba8f5a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapab0766c8-c7', 'timestamp': '2025-11-28T16:34:35.397222', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapab0766c8-c7', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:d1:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab0766c8-c7'}, 'message_id': '200751e6-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': 'f6ecb3d5a81785d5fda00d654e35b24174ad39af3e8c511f51229e7f793a451d'}]}, 'timestamp': '2025-11-28 16:34:35.397970', '_unique_id': '9bb2ffac438a47aa9fa2a06920f274da'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.398 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.399 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.399 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.write.bytes volume: 24518656 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.400 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b6088f89-9c73-4a5b-b6c5-eacbc9abf74f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 24518656, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-vda', 'timestamp': '2025-11-28T16:34:35.399882', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '2007ad76-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': '8610817172af536f254d451ce93cc35523329091457f8fca67da0e8ae7efa7c6'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 
'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-sda', 'timestamp': '2025-11-28T16:34:35.399882', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '2007ba78-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': 'ba5c553db71ca93e01561175a588f12502a912fff832bfc8a97157e419174674'}]}, 'timestamp': '2025-11-28 16:34:35.400606', '_unique_id': '7334a6f0f4164c498c039ccba54226d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.401 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.402 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.402 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.402 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ad8e85f2-979b-4788-9949-f48b7a22676d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapc2abe47d-71', 'timestamp': '2025-11-28T16:34:35.402556', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapc2abe47d-71', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:f4:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2abe47d-71'}, 'message_id': '200814aa-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '4a0657539c3533780feca2781ae5668f93fe56ba2691db5df1d6bba0e0e5b6ff'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapab0766c8-c7', 'timestamp': '2025-11-28T16:34:35.402556', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapab0766c8-c7', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:d1:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab0766c8-c7'}, 'message_id': '20082350-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '5d90eeb7ecd45d9e229d5b90ba9bd9011e778ce996ad74a94b529c1affb7e393'}]}, 'timestamp': '2025-11-28 16:34:35.403302', '_unique_id': '74addaa4e8bc4c10bcf03abe1d2b70c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.405 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.405 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.405 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-894824609>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-894824609>]
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.405 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.420 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.usage volume: 27262976 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.420 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b12a5a94-8d83-4f31-958e-3041f0f9bbf7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 27262976, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-vda', 'timestamp': '2025-11-28T16:34:35.405802', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '200accae-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4775.040030358, 'message_signature': 'b1a7a178334050d3a8481c1651dd66f13f442de40316623db9745b547e8513c2'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 
'b890bd95-f884-4215-91f2-749834092bc1-sda', 'timestamp': '2025-11-28T16:34:35.405802', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '200ada14-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4775.040030358, 'message_signature': '6db50046fe549d3bff8a7af3e29a88941264598f9030f6b367a13f79e401ecf5'}]}, 'timestamp': '2025-11-28 16:34:35.421054', '_unique_id': '72305431ad1a4e70aa24ef05342c26b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.421 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.422 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.423 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.allocation volume: 28057600 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.423 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a48c2c42-133f-4a1b-908f-06ff4fefdb9e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 28057600, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-vda', 'timestamp': '2025-11-28T16:34:35.423126', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '200b381a-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4775.040030358, 'message_signature': 'bfe6588872201e738f876c7d9f9a95b05ce13b301a2eda1fa1638d919676ee62'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 
'b890bd95-f884-4215-91f2-749834092bc1-sda', 'timestamp': '2025-11-28T16:34:35.423126', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '200b44fe-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4775.040030358, 'message_signature': '3152f6c9ae88d369e7abcf44c6133ce2bd28b871983b2b9b447fc87d08cf46fc'}]}, 'timestamp': '2025-11-28 16:34:35.423810', '_unique_id': 'cc2a5ba511bf492ca8fda2f45e4025f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.424 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.425 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.426 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.426 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '71acad9a-6dfe-4198-9c41-a1833233bcdf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapc2abe47d-71', 'timestamp': '2025-11-28T16:34:35.426070', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapc2abe47d-71', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:f4:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2abe47d-71'}, 'message_id': '200bab7e-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '04da79f6c61fa959a172eba7a3c1d5f737a4fce6509d8472e8d7953529f7644c'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapab0766c8-c7', 'timestamp': '2025-11-28T16:34:35.426070', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapab0766c8-c7', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:d1:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab0766c8-c7'}, 'message_id': '200bb9f2-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '4ff69d06f51439e1f5e7c125fa7a3cb76baec2de8b740643d136d2519a72f03c'}]}, 'timestamp': '2025-11-28 16:34:35.426814', '_unique_id': '3edcd55d2b6f46c8943a122529d81a9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.427 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.428 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.428 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.write.requests volume: 208 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.429 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6edc81f1-dfa4-4513-a8a2-29ee0f3d6be7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 208, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-vda', 'timestamp': '2025-11-28T16:34:35.428747', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '200c14b0-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': '9a5b878f63a74b4366b26a6b4cccfe470bc0b950e9b7312d0448a74d3e8ff2aa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': 
None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-sda', 'timestamp': '2025-11-28T16:34:35.428747', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '200c2252-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': '4322ee7bc0ac14e6862172d4d086d1e8d64820700d1fdd1399d0e3712157a214'}]}, 'timestamp': '2025-11-28 16:34:35.429477', '_unique_id': '579ec14f4505420b9314131dbd4b619d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.430 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.431 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.431 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.431 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10001e38-605c-4221-bc72-6f272d027c6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-vda', 'timestamp': '2025-11-28T16:34:35.431398', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '200c7b1c-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4775.040030358, 'message_signature': '475323322d79e5389973ee51d4fe52cb12fd34b5c48f9a9ed182c9afa74e8848'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 
'b890bd95-f884-4215-91f2-749834092bc1-sda', 'timestamp': '2025-11-28T16:34:35.431398', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '200c8918-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4775.040030358, 'message_signature': 'a3b04955c7d72177e248ce53889af8297a362971e45102ad3f317958d4c0906e'}]}, 'timestamp': '2025-11-28 16:34:35.432107', '_unique_id': '7d980f728de7453c81065d006ca1e9b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.432 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.433 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.434 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.434 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f8cc0d76-531b-47ae-8c45-5dca4f4ec661', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapc2abe47d-71', 'timestamp': '2025-11-28T16:34:35.434033', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapc2abe47d-71', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:f4:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2abe47d-71'}, 'message_id': '200ce246-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '965fb7eb56a60b9ed6af246680247f6ccaf737fce569385a0a4e2593decb89bd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapab0766c8-c7', 'timestamp': '2025-11-28T16:34:35.434033', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapab0766c8-c7', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:d1:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab0766c8-c7'}, 'message_id': '200cefac-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '08c8d26b5074ce6b2f19a579bbce0e1c89ec4b3834565c6c6c7cda88b3bad062'}]}, 'timestamp': '2025-11-28 16:34:35.434745', '_unique_id': '0914b4c3852747d7bc8e1878f9ced484'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.435 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.436 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.436 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.read.latency volume: 199843739 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.437 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/disk.device.read.latency volume: 3067042 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2cdaa9b5-cdb3-47c1-8413-21c2c4681c79', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 199843739, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-vda', 'timestamp': '2025-11-28T16:34:35.436657', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '200d4876-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': '3870d2aa9de8b6599be31301cc840f853d3e31d59370446738d7361c47d2d861'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3067042, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 
'resource_id': 'b890bd95-f884-4215-91f2-749834092bc1-sda', 'timestamp': '2025-11-28T16:34:35.436657', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'instance-00000030', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '200d5762-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.983195358, 'message_signature': 'f029947dc1427279c31835e82d7498b21c66b0f694657f5eff103037b6ce79e8'}]}, 'timestamp': '2025-11-28 16:34:35.437389', '_unique_id': '9d46efe497a441b59c5bc86fbe3df493'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.438 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.439 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.440 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.440 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '393ec83a-bb5c-46c8-8449-d85c0298fcbe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapc2abe47d-71', 'timestamp': '2025-11-28T16:34:35.439969', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapc2abe47d-71', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:f4:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2abe47d-71'}, 'message_id': '200dd318-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '43cbce83d77ba5e5287ea18575d1dc98f2963d527f18cb9ff07a7e8cdb935bfa'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapab0766c8-c7', 'timestamp': '2025-11-28T16:34:35.439969', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapab0766c8-c7', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:d1:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab0766c8-c7'}, 'message_id': '200dea24-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': 'e18858c988b421bad0ec54b62879bddcff292393c80b567b986531c6b755d7fd'}]}, 'timestamp': '2025-11-28 16:34:35.441261', '_unique_id': 'f1d994341f6f47f48ead39103921011f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.443 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.444 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.444 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.444 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f770710-0c8c-4ebe-925f-9a58b1b8f6f9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapc2abe47d-71', 'timestamp': '2025-11-28T16:34:35.444321', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapc2abe47d-71', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:f4:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2abe47d-71'}, 'message_id': '200e7318-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '065f54c8fcd1438e6fca1eec3ac4043876cfed904435a8d00659bc328ebf79e4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapab0766c8-c7', 'timestamp': '2025-11-28T16:34:35.444321', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapab0766c8-c7', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:d1:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab0766c8-c7'}, 'message_id': '200e806a-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': 'f8741190b366b7cf279f00daf3f3b1a19133a78fab4108fc25fe13b77b0aeb8b'}]}, 'timestamp': '2025-11-28 16:34:35.445039', '_unique_id': '0dc97da8cb704802b5b204570cf70822'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.445 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.446 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.446 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.447 12 DEBUG ceilometer.compute.pollsters [-] b890bd95-f884-4215-91f2-749834092bc1/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fa634e4c-c601-45e9-9acb-5568b233e3d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapc2abe47d-71', 'timestamp': '2025-11-28T16:34:35.446766', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapc2abe47d-71', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:f4:04', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc2abe47d-71'}, 'message_id': '200ed3bc-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '0b15c9a29a37434ffd44bb094c61ed1bc8c06f6546b3fc97384852840a3d832b'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': 
'23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000030-b890bd95-f884-4215-91f2-749834092bc1-tapab0766c8-c7', 'timestamp': '2025-11-28T16:34:35.446766', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-894824609', 'name': 'tapab0766c8-c7', 'instance_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:3f:d1:2c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapab0766c8-c7'}, 'message_id': '200ee0dc-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 4774.976498237, 'message_signature': '73009af174486f8bbd7400a481da84cf3c43ada8fbdcbae99088eae9862ad6f5'}]}, 'timestamp': '2025-11-28 16:34:35.447480', '_unique_id': '076b60eb3b52449383420ebe4c1e60a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:34:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:34:35.448 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.023 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updating instance_info_cache with network_info: [{"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.046 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.047 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.047 187256 DEBUG oslo_concurrency.lockutils [req-1eaee403-36c4-4858-809d-45817e943aef req-710f3a6d-058d-4097-8c0b-99632a8e6756 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.048 187256 DEBUG nova.network.neutron [req-1eaee403-36c4-4858-809d-45817e943aef req-710f3a6d-058d-4097-8c0b-99632a8e6756 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Refreshing network info cache for port c2abe47d-718c-4e51-b661-823b9fa7add9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.049 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.050 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.050 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.050 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.083 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.083 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.084 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.084 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.202 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.267 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.269 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.337 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.467 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.469 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5543MB free_disk=73.31010055541992GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.469 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.470 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.558 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance b890bd95-f884-4215-91f2-749834092bc1 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.559 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.559 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.582 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing inventories for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.623 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating ProviderTree inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.624 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.658 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing aggregate associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.684 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing trait associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, traits: COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.753 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.771 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.796 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:34:36 np0005538960 nova_compute[187252]: 2025-11-28 16:34:36.797 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.327s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:34:36 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:36Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:24:f4:04 10.100.0.12
Nov 28 11:34:36 np0005538960 ovn_controller[95460]: 2025-11-28T16:34:36Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:24:f4:04 10.100.0.12
Nov 28 11:34:37 np0005538960 nova_compute[187252]: 2025-11-28 16:34:37.742 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:37 np0005538960 nova_compute[187252]: 2025-11-28 16:34:37.789 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:34:38 np0005538960 nova_compute[187252]: 2025-11-28 16:34:38.697 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:39 np0005538960 podman[223763]: 2025-11-28 16:34:39.158565373 +0000 UTC m=+0.064281510 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 11:34:39 np0005538960 nova_compute[187252]: 2025-11-28 16:34:39.746 187256 DEBUG nova.network.neutron [req-1eaee403-36c4-4858-809d-45817e943aef req-710f3a6d-058d-4097-8c0b-99632a8e6756 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updated VIF entry in instance network info cache for port c2abe47d-718c-4e51-b661-823b9fa7add9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:34:39 np0005538960 nova_compute[187252]: 2025-11-28 16:34:39.747 187256 DEBUG nova.network.neutron [req-1eaee403-36c4-4858-809d-45817e943aef req-710f3a6d-058d-4097-8c0b-99632a8e6756 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updating instance_info_cache with network_info: [{"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:34:39 np0005538960 nova_compute[187252]: 2025-11-28 16:34:39.788 187256 DEBUG oslo_concurrency.lockutils [req-1eaee403-36c4-4858-809d-45817e943aef req-710f3a6d-058d-4097-8c0b-99632a8e6756 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:34:42 np0005538960 nova_compute[187252]: 2025-11-28 16:34:42.743 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:43 np0005538960 nova_compute[187252]: 2025-11-28 16:34:43.698 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:44 np0005538960 podman[223783]: 2025-11-28 16:34:44.173963515 +0000 UTC m=+0.079242250 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:34:47 np0005538960 nova_compute[187252]: 2025-11-28 16:34:47.746 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:48 np0005538960 nova_compute[187252]: 2025-11-28 16:34:48.701 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:50 np0005538960 podman[223806]: 2025-11-28 16:34:50.201704106 +0000 UTC m=+0.097907631 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 11:34:52 np0005538960 nova_compute[187252]: 2025-11-28 16:34:52.748 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:53 np0005538960 podman[223832]: 2025-11-28 16:34:53.165308444 +0000 UTC m=+0.065151471 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:34:53 np0005538960 podman[223833]: 2025-11-28 16:34:53.165543189 +0000 UTC m=+0.062707381 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent)
Nov 28 11:34:53 np0005538960 nova_compute[187252]: 2025-11-28 16:34:53.704 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:57 np0005538960 nova_compute[187252]: 2025-11-28 16:34:57.750 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:58 np0005538960 nova_compute[187252]: 2025-11-28 16:34:58.706 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:34:59 np0005538960 podman[223869]: 2025-11-28 16:34:59.15302653 +0000 UTC m=+0.056515624 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:35:02 np0005538960 podman[223893]: 2025-11-28 16:35:02.157641167 +0000 UTC m=+0.055670173 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Nov 28 11:35:02 np0005538960 nova_compute[187252]: 2025-11-28 16:35:02.752 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:03 np0005538960 nova_compute[187252]: 2025-11-28 16:35:03.708 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:06.355 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:35:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:06.355 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:35:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:06.356 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:35:07 np0005538960 nova_compute[187252]: 2025-11-28 16:35:07.754 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:08 np0005538960 nova_compute[187252]: 2025-11-28 16:35:08.711 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:10 np0005538960 podman[223919]: 2025-11-28 16:35:10.161361232 +0000 UTC m=+0.062943428 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:35:12 np0005538960 nova_compute[187252]: 2025-11-28 16:35:12.756 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:13 np0005538960 nova_compute[187252]: 2025-11-28 16:35:13.713 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:15 np0005538960 podman[223939]: 2025-11-28 16:35:15.148283519 +0000 UTC m=+0.054092085 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:35:17 np0005538960 nova_compute[187252]: 2025-11-28 16:35:17.758 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:18 np0005538960 nova_compute[187252]: 2025-11-28 16:35:18.715 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:21 np0005538960 podman[223964]: 2025-11-28 16:35:21.20050635 +0000 UTC m=+0.098457334 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:35:22 np0005538960 nova_compute[187252]: 2025-11-28 16:35:22.760 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:23 np0005538960 nova_compute[187252]: 2025-11-28 16:35:23.718 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:24 np0005538960 podman[223993]: 2025-11-28 16:35:24.156065565 +0000 UTC m=+0.059244088 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:35:24 np0005538960 podman[223992]: 2025-11-28 16:35:24.161024005 +0000 UTC m=+0.065590432 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 11:35:25 np0005538960 nova_compute[187252]: 2025-11-28 16:35:25.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:35:26 np0005538960 nova_compute[187252]: 2025-11-28 16:35:26.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:35:27 np0005538960 nova_compute[187252]: 2025-11-28 16:35:27.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:35:27 np0005538960 nova_compute[187252]: 2025-11-28 16:35:27.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:35:27 np0005538960 nova_compute[187252]: 2025-11-28 16:35:27.800 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:27 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:27.865 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:35:27 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:27.866 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:35:27 np0005538960 nova_compute[187252]: 2025-11-28 16:35:27.866 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:28 np0005538960 nova_compute[187252]: 2025-11-28 16:35:28.720 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:29 np0005538960 nova_compute[187252]: 2025-11-28 16:35:29.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:35:29 np0005538960 nova_compute[187252]: 2025-11-28 16:35:29.317 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:35:29 np0005538960 nova_compute[187252]: 2025-11-28 16:35:29.317 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:35:29 np0005538960 nova_compute[187252]: 2025-11-28 16:35:29.598 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:35:29 np0005538960 nova_compute[187252]: 2025-11-28 16:35:29.599 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:35:29 np0005538960 nova_compute[187252]: 2025-11-28 16:35:29.599 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:35:29 np0005538960 nova_compute[187252]: 2025-11-28 16:35:29.599 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b890bd95-f884-4215-91f2-749834092bc1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:35:30 np0005538960 podman[224033]: 2025-11-28 16:35:30.151230522 +0000 UTC m=+0.057072757 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.425 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updating instance_info_cache with network_info: [{"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.443 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.443 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.444 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.444 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.790 187256 DEBUG nova.compute.manager [req-84fac40d-6701-40f2-8687-edaedd4faf25 req-2d7615cb-4054-4a43-aa43-3cc3c288131b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-changed-c2abe47d-718c-4e51-b661-823b9fa7add9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.791 187256 DEBUG nova.compute.manager [req-84fac40d-6701-40f2-8687-edaedd4faf25 req-2d7615cb-4054-4a43-aa43-3cc3c288131b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Refreshing instance network info cache due to event network-changed-c2abe47d-718c-4e51-b661-823b9fa7add9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.791 187256 DEBUG oslo_concurrency.lockutils [req-84fac40d-6701-40f2-8687-edaedd4faf25 req-2d7615cb-4054-4a43-aa43-3cc3c288131b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.791 187256 DEBUG oslo_concurrency.lockutils [req-84fac40d-6701-40f2-8687-edaedd4faf25 req-2d7615cb-4054-4a43-aa43-3cc3c288131b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.792 187256 DEBUG nova.network.neutron [req-84fac40d-6701-40f2-8687-edaedd4faf25 req-2d7615cb-4054-4a43-aa43-3cc3c288131b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Refreshing network info cache for port c2abe47d-718c-4e51-b661-823b9fa7add9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.801 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.872 187256 DEBUG oslo_concurrency.lockutils [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.873 187256 DEBUG oslo_concurrency.lockutils [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.873 187256 DEBUG oslo_concurrency.lockutils [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.873 187256 DEBUG oslo_concurrency.lockutils [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.874 187256 DEBUG oslo_concurrency.lockutils [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.875 187256 INFO nova.compute.manager [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Terminating instance#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.876 187256 DEBUG nova.compute.manager [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:35:32 np0005538960 kernel: tapc2abe47d-71 (unregistering): left promiscuous mode
Nov 28 11:35:32 np0005538960 NetworkManager[55548]: <info>  [1764347732.9000] device (tapc2abe47d-71): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.906 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:32 np0005538960 ovn_controller[95460]: 2025-11-28T16:35:32Z|00231|binding|INFO|Releasing lport c2abe47d-718c-4e51-b661-823b9fa7add9 from this chassis (sb_readonly=0)
Nov 28 11:35:32 np0005538960 ovn_controller[95460]: 2025-11-28T16:35:32Z|00232|binding|INFO|Setting lport c2abe47d-718c-4e51-b661-823b9fa7add9 down in Southbound
Nov 28 11:35:32 np0005538960 ovn_controller[95460]: 2025-11-28T16:35:32Z|00233|binding|INFO|Removing iface tapc2abe47d-71 ovn-installed in OVS
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.909 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:32.915 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:f4:04 10.100.0.12'], port_security=['fa:16:3e:24:f4:04 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f167768e-3551-41c3-a3de-da1c1ed19a2c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'efa51696-1cad-4945-b794-623719fa4d3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9ea0ae4f-79c3-4737-b920-c386035d0846, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=c2abe47d-718c-4e51-b661-823b9fa7add9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:35:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:32.917 104369 INFO neutron.agent.ovn.metadata.agent [-] Port c2abe47d-718c-4e51-b661-823b9fa7add9 in datapath f167768e-3551-41c3-a3de-da1c1ed19a2c unbound from our chassis#033[00m
Nov 28 11:35:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:32.920 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f167768e-3551-41c3-a3de-da1c1ed19a2c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:35:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:32.923 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[eed34901-6954-4ba8-88e8-a2b15247613d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:32.924 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c namespace which is not needed anymore#033[00m
Nov 28 11:35:32 np0005538960 kernel: tapab0766c8-c7 (unregistering): left promiscuous mode
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.927 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:32 np0005538960 NetworkManager[55548]: <info>  [1764347732.9316] device (tapab0766c8-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.935 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.948 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:32 np0005538960 ovn_controller[95460]: 2025-11-28T16:35:32Z|00234|binding|INFO|Releasing lport ab0766c8-c7bc-4af1-9cc0-971475b014b7 from this chassis (sb_readonly=0)
Nov 28 11:35:32 np0005538960 ovn_controller[95460]: 2025-11-28T16:35:32Z|00235|binding|INFO|Setting lport ab0766c8-c7bc-4af1-9cc0-971475b014b7 down in Southbound
Nov 28 11:35:32 np0005538960 ovn_controller[95460]: 2025-11-28T16:35:32Z|00236|binding|INFO|Removing iface tapab0766c8-c7 ovn-installed in OVS
Nov 28 11:35:32 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:32.957 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:d1:2c 2001:db8::f816:3eff:fe3f:d12c'], port_security=['fa:16:3e:3f:d1:2c 2001:db8::f816:3eff:fe3f:d12c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe3f:d12c/64', 'neutron:device_id': 'b890bd95-f884-4215-91f2-749834092bc1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'efa51696-1cad-4945-b794-623719fa4d3c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30aaa9d3-aada-4fac-a64f-fec159b96017, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=ab0766c8-c7bc-4af1-9cc0-971475b014b7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:35:32 np0005538960 nova_compute[187252]: 2025-11-28 16:35:32.967 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:32 np0005538960 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000030.scope: Deactivated successfully.
Nov 28 11:35:32 np0005538960 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000030.scope: Consumed 17.121s CPU time.
Nov 28 11:35:32 np0005538960 systemd-machined[153518]: Machine qemu-18-instance-00000030 terminated.
Nov 28 11:35:33 np0005538960 podman[224059]: 2025-11-28 16:35:33.015812184 +0000 UTC m=+0.085218085 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=edpm, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Nov 28 11:35:33 np0005538960 NetworkManager[55548]: <info>  [1764347733.1140] manager: (tapab0766c8-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Nov 28 11:35:33 np0005538960 neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c[223550]: [NOTICE]   (223567) : haproxy version is 2.8.14-c23fe91
Nov 28 11:35:33 np0005538960 neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c[223550]: [NOTICE]   (223567) : path to executable is /usr/sbin/haproxy
Nov 28 11:35:33 np0005538960 neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c[223550]: [WARNING]  (223567) : Exiting Master process...
Nov 28 11:35:33 np0005538960 neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c[223550]: [WARNING]  (223567) : Exiting Master process...
Nov 28 11:35:33 np0005538960 neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c[223550]: [ALERT]    (223567) : Current worker (223569) exited with code 143 (Terminated)
Nov 28 11:35:33 np0005538960 neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c[223550]: [WARNING]  (223567) : All workers exited. Exiting... (0)
Nov 28 11:35:33 np0005538960 systemd[1]: libpod-19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621.scope: Deactivated successfully.
Nov 28 11:35:33 np0005538960 podman[224108]: 2025-11-28 16:35:33.15879333 +0000 UTC m=+0.136213673 container died 19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.159 187256 INFO nova.virt.libvirt.driver [-] [instance: b890bd95-f884-4215-91f2-749834092bc1] Instance destroyed successfully.#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.160 187256 DEBUG nova.objects.instance [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lazy-loading 'resources' on Instance uuid b890bd95-f884-4215-91f2-749834092bc1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.178 187256 DEBUG nova.virt.libvirt.vif [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:33:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-894824609',display_name='tempest-TestGettingAddress-server-894824609',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-894824609',id=48,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFktOrX6gF2AzWhwO+nGvoS57OBSP3laYDHq62fCTtGv8c+DoaFIxSAOh6oCv+DiEK35kK0uU+oYfnbqBnIodHTIIADd7iRQsOEXhskcFfG472xjkhS/wB+Vvdgt7W/5jg==',key_name='tempest-TestGettingAddress-1253484732',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:34:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-eejrb9bi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:34:20Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=b890bd95-f884-4215-91f2-749834092bc1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.179 187256 DEBUG nova.network.os_vif_util [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.211", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.179 187256 DEBUG nova.network.os_vif_util [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:f4:04,bridge_name='br-int',has_traffic_filtering=True,id=c2abe47d-718c-4e51-b661-823b9fa7add9,network=Network(f167768e-3551-41c3-a3de-da1c1ed19a2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2abe47d-71') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.180 187256 DEBUG os_vif [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:f4:04,bridge_name='br-int',has_traffic_filtering=True,id=c2abe47d-718c-4e51-b661-823b9fa7add9,network=Network(f167768e-3551-41c3-a3de-da1c1ed19a2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2abe47d-71') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.181 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.182 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2abe47d-71, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.183 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.186 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.189 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.191 187256 INFO os_vif [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:f4:04,bridge_name='br-int',has_traffic_filtering=True,id=c2abe47d-718c-4e51-b661-823b9fa7add9,network=Network(f167768e-3551-41c3-a3de-da1c1ed19a2c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2abe47d-71')#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.192 187256 DEBUG nova.virt.libvirt.vif [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:33:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-894824609',display_name='tempest-TestGettingAddress-server-894824609',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-894824609',id=48,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFktOrX6gF2AzWhwO+nGvoS57OBSP3laYDHq62fCTtGv8c+DoaFIxSAOh6oCv+DiEK35kK0uU+oYfnbqBnIodHTIIADd7iRQsOEXhskcFfG472xjkhS/wB+Vvdgt7W/5jg==',key_name='tempest-TestGettingAddress-1253484732',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:34:20Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-eejrb9bi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:34:20Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=b890bd95-f884-4215-91f2-749834092bc1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.192 187256 DEBUG nova.network.os_vif_util [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.193 187256 DEBUG nova.network.os_vif_util [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3f:d1:2c,bridge_name='br-int',has_traffic_filtering=True,id=ab0766c8-c7bc-4af1-9cc0-971475b014b7,network=Network(c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0766c8-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.193 187256 DEBUG os_vif [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:d1:2c,bridge_name='br-int',has_traffic_filtering=True,id=ab0766c8-c7bc-4af1-9cc0-971475b014b7,network=Network(c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0766c8-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.195 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.195 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab0766c8-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.197 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.198 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.200 187256 INFO os_vif [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3f:d1:2c,bridge_name='br-int',has_traffic_filtering=True,id=ab0766c8-c7bc-4af1-9cc0-971475b014b7,network=Network(c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapab0766c8-c7')#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.201 187256 INFO nova.virt.libvirt.driver [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Deleting instance files /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1_del#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.201 187256 INFO nova.virt.libvirt.driver [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Deletion of /var/lib/nova/instances/b890bd95-f884-4215-91f2-749834092bc1_del complete#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.247 187256 INFO nova.compute.manager [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.248 187256 DEBUG oslo.service.loopingcall [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.248 187256 DEBUG nova.compute.manager [-] [instance: b890bd95-f884-4215-91f2-749834092bc1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.248 187256 DEBUG nova.network.neutron [-] [instance: b890bd95-f884-4215-91f2-749834092bc1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.345 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.346 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.346 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.346 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:35:33 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621-userdata-shm.mount: Deactivated successfully.
Nov 28 11:35:33 np0005538960 systemd[1]: var-lib-containers-storage-overlay-8fc1b0a6ecd058275e21f89c7b965aef03667e29e48522afc9f914e9830427cc-merged.mount: Deactivated successfully.
Nov 28 11:35:33 np0005538960 podman[224108]: 2025-11-28 16:35:33.487678577 +0000 UTC m=+0.465098920 container cleanup 19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:35:33 np0005538960 systemd[1]: libpod-conmon-19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621.scope: Deactivated successfully.
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.566 187256 DEBUG nova.compute.manager [req-1a62fc2d-c129-4112-92bf-d6ba7ef97d19 req-c223e052-74db-42dc-8c4a-12526dc9f0a5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-unplugged-c2abe47d-718c-4e51-b661-823b9fa7add9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.567 187256 DEBUG oslo_concurrency.lockutils [req-1a62fc2d-c129-4112-92bf-d6ba7ef97d19 req-c223e052-74db-42dc-8c4a-12526dc9f0a5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.567 187256 DEBUG oslo_concurrency.lockutils [req-1a62fc2d-c129-4112-92bf-d6ba7ef97d19 req-c223e052-74db-42dc-8c4a-12526dc9f0a5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.567 187256 DEBUG oslo_concurrency.lockutils [req-1a62fc2d-c129-4112-92bf-d6ba7ef97d19 req-c223e052-74db-42dc-8c4a-12526dc9f0a5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.567 187256 DEBUG nova.compute.manager [req-1a62fc2d-c129-4112-92bf-d6ba7ef97d19 req-c223e052-74db-42dc-8c4a-12526dc9f0a5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] No waiting events found dispatching network-vif-unplugged-c2abe47d-718c-4e51-b661-823b9fa7add9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.568 187256 DEBUG nova.compute.manager [req-1a62fc2d-c129-4112-92bf-d6ba7ef97d19 req-c223e052-74db-42dc-8c4a-12526dc9f0a5 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-unplugged-c2abe47d-718c-4e51-b661-823b9fa7add9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.583 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.584 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5698MB free_disk=73.33781433105469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.585 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.585 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.671 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance b890bd95-f884-4215-91f2-749834092bc1 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.672 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.672 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.725 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.733 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.746 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.776 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.776 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:35:33 np0005538960 podman[224161]: 2025-11-28 16:35:33.816884722 +0000 UTC m=+0.302071792 container remove 19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.823 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e6d1bf-9086-46c2-850d-3d7978d007c5]: (4, ('Fri Nov 28 04:35:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c (19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621)\n19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621\nFri Nov 28 04:35:33 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c (19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621)\n19a5471747a15aaf5e7d0a41c4a6a77b22b774c6b4d4f0bb2c4d149cb091f621\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.826 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[efd80238-9468-4958-9f7e-6a4bf76c28e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.827 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf167768e-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:35:33 np0005538960 kernel: tapf167768e-30: left promiscuous mode
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.829 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:33 np0005538960 nova_compute[187252]: 2025-11-28 16:35:33.842 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.846 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[fc589988-62f1-455d-9323-e19a870b50f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.863 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb56ec0-24f4-40c6-a3c4-7a79938d08c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.865 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[e25122a7-d14e-4c44-81b5-8d837737cf5e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.889 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[9068e4ce-2999-487e-94b7-d22e0cd9bf0c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475829, 'reachable_time': 32728, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224174, 'error': None, 'target': 'ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:33 np0005538960 systemd[1]: run-netns-ovnmeta\x2df167768e\x2d3551\x2d41c3\x2da3de\x2dda1c1ed19a2c.mount: Deactivated successfully.
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.895 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f167768e-3551-41c3-a3de-da1c1ed19a2c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.896 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[32cb0577-ee08-4285-a913-c2af4fe089f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.898 104369 INFO neutron.agent.ovn.metadata.agent [-] Port ab0766c8-c7bc-4af1-9cc0-971475b014b7 in datapath c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da unbound from our chassis#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.899 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.901 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3c7de67f-7103-47dc-8e46-129d446d3c04]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:33.901 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da namespace which is not needed anymore#033[00m
Nov 28 11:35:34 np0005538960 neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da[223637]: [NOTICE]   (223641) : haproxy version is 2.8.14-c23fe91
Nov 28 11:35:34 np0005538960 neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da[223637]: [NOTICE]   (223641) : path to executable is /usr/sbin/haproxy
Nov 28 11:35:34 np0005538960 neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da[223637]: [WARNING]  (223641) : Exiting Master process...
Nov 28 11:35:34 np0005538960 neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da[223637]: [WARNING]  (223641) : Exiting Master process...
Nov 28 11:35:34 np0005538960 neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da[223637]: [ALERT]    (223641) : Current worker (223643) exited with code 143 (Terminated)
Nov 28 11:35:34 np0005538960 neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da[223637]: [WARNING]  (223641) : All workers exited. Exiting... (0)
Nov 28 11:35:34 np0005538960 systemd[1]: libpod-9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9.scope: Deactivated successfully.
Nov 28 11:35:34 np0005538960 podman[224191]: 2025-11-28 16:35:34.039426175 +0000 UTC m=+0.047140107 container died 9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Nov 28 11:35:34 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9-userdata-shm.mount: Deactivated successfully.
Nov 28 11:35:34 np0005538960 systemd[1]: var-lib-containers-storage-overlay-6b2b0cb4615a1e6653117f75d19d2f2e9a7dcd96534e9bfd64744e613eb081b0-merged.mount: Deactivated successfully.
Nov 28 11:35:34 np0005538960 podman[224191]: 2025-11-28 16:35:34.078277112 +0000 UTC m=+0.085991044 container cleanup 9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 11:35:34 np0005538960 systemd[1]: libpod-conmon-9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9.scope: Deactivated successfully.
Nov 28 11:35:34 np0005538960 podman[224221]: 2025-11-28 16:35:34.771534501 +0000 UTC m=+0.671638219 container remove 9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:35:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:34.779 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[61f17555-bbc1-40c4-a9a0-081b00d2178f]: (4, ('Fri Nov 28 04:35:33 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da (9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9)\n9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9\nFri Nov 28 04:35:34 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da (9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9)\n9044e8514d7f47d146a7d5954223f8ae52d0b4a479939b57e41ef2b8e8a12fa9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:34.782 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[0539814f-179b-498b-9580-74eb65fe94a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:34.784 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc053bf8e-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:35:34 np0005538960 nova_compute[187252]: 2025-11-28 16:35:34.786 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:34 np0005538960 kernel: tapc053bf8e-50: left promiscuous mode
Nov 28 11:35:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:34.793 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[14c7dac5-51cb-4ff2-8299-d6c9d20a459b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:34 np0005538960 nova_compute[187252]: 2025-11-28 16:35:34.801 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:34.817 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[5a9c23db-157d-4f25-b79d-402e5640f882]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:34.820 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a3166158-6f39-4222-89ad-47367f098515]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:34.839 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[01dcd460-e92d-41c5-9150-c939186c4384]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 475960, 'reachable_time': 17465, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224235, 'error': None, 'target': 'ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:34 np0005538960 systemd[1]: run-netns-ovnmeta\x2dc053bf8e\x2d59fa\x2d4bf5\x2db9f4\x2d2d0fb36f69da.mount: Deactivated successfully.
Nov 28 11:35:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:34.843 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:35:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:34.843 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[e95f1a52-37b1-4666-a564-081e0d2f4f57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:35:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:35:34.868 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:35:34 np0005538960 nova_compute[187252]: 2025-11-28 16:35:34.921 187256 DEBUG nova.compute.manager [req-e0031e4b-befa-4490-8633-4e2c0f4c877b req-c9086433-f2d0-4378-96d2-aafb7e2db931 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-unplugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:35:34 np0005538960 nova_compute[187252]: 2025-11-28 16:35:34.922 187256 DEBUG oslo_concurrency.lockutils [req-e0031e4b-befa-4490-8633-4e2c0f4c877b req-c9086433-f2d0-4378-96d2-aafb7e2db931 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:35:34 np0005538960 nova_compute[187252]: 2025-11-28 16:35:34.923 187256 DEBUG oslo_concurrency.lockutils [req-e0031e4b-befa-4490-8633-4e2c0f4c877b req-c9086433-f2d0-4378-96d2-aafb7e2db931 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:35:34 np0005538960 nova_compute[187252]: 2025-11-28 16:35:34.923 187256 DEBUG oslo_concurrency.lockutils [req-e0031e4b-befa-4490-8633-4e2c0f4c877b req-c9086433-f2d0-4378-96d2-aafb7e2db931 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:35:34 np0005538960 nova_compute[187252]: 2025-11-28 16:35:34.924 187256 DEBUG nova.compute.manager [req-e0031e4b-befa-4490-8633-4e2c0f4c877b req-c9086433-f2d0-4378-96d2-aafb7e2db931 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] No waiting events found dispatching network-vif-unplugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:35:34 np0005538960 nova_compute[187252]: 2025-11-28 16:35:34.924 187256 DEBUG nova.compute.manager [req-e0031e4b-befa-4490-8633-4e2c0f4c877b req-c9086433-f2d0-4378-96d2-aafb7e2db931 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-unplugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:35:35 np0005538960 nova_compute[187252]: 2025-11-28 16:35:35.659 187256 DEBUG nova.compute.manager [req-4c51aa3a-9d4b-4f3c-b9e0-7305748118bf req-26a2e361-bbe6-475e-a4ba-401ca2d22cfa 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-plugged-c2abe47d-718c-4e51-b661-823b9fa7add9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:35:35 np0005538960 nova_compute[187252]: 2025-11-28 16:35:35.660 187256 DEBUG oslo_concurrency.lockutils [req-4c51aa3a-9d4b-4f3c-b9e0-7305748118bf req-26a2e361-bbe6-475e-a4ba-401ca2d22cfa 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:35:35 np0005538960 nova_compute[187252]: 2025-11-28 16:35:35.660 187256 DEBUG oslo_concurrency.lockutils [req-4c51aa3a-9d4b-4f3c-b9e0-7305748118bf req-26a2e361-bbe6-475e-a4ba-401ca2d22cfa 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:35:35 np0005538960 nova_compute[187252]: 2025-11-28 16:35:35.660 187256 DEBUG oslo_concurrency.lockutils [req-4c51aa3a-9d4b-4f3c-b9e0-7305748118bf req-26a2e361-bbe6-475e-a4ba-401ca2d22cfa 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:35:35 np0005538960 nova_compute[187252]: 2025-11-28 16:35:35.660 187256 DEBUG nova.compute.manager [req-4c51aa3a-9d4b-4f3c-b9e0-7305748118bf req-26a2e361-bbe6-475e-a4ba-401ca2d22cfa 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] No waiting events found dispatching network-vif-plugged-c2abe47d-718c-4e51-b661-823b9fa7add9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:35:35 np0005538960 nova_compute[187252]: 2025-11-28 16:35:35.661 187256 WARNING nova.compute.manager [req-4c51aa3a-9d4b-4f3c-b9e0-7305748118bf req-26a2e361-bbe6-475e-a4ba-401ca2d22cfa 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received unexpected event network-vif-plugged-c2abe47d-718c-4e51-b661-823b9fa7add9 for instance with vm_state active and task_state deleting.#033[00m
Nov 28 11:35:35 np0005538960 nova_compute[187252]: 2025-11-28 16:35:35.772 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:35:35 np0005538960 nova_compute[187252]: 2025-11-28 16:35:35.794 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:35:35 np0005538960 nova_compute[187252]: 2025-11-28 16:35:35.985 187256 DEBUG nova.network.neutron [req-84fac40d-6701-40f2-8687-edaedd4faf25 req-2d7615cb-4054-4a43-aa43-3cc3c288131b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updated VIF entry in instance network info cache for port c2abe47d-718c-4e51-b661-823b9fa7add9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:35:35 np0005538960 nova_compute[187252]: 2025-11-28 16:35:35.987 187256 DEBUG nova.network.neutron [req-84fac40d-6701-40f2-8687-edaedd4faf25 req-2d7615cb-4054-4a43-aa43-3cc3c288131b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updating instance_info_cache with network_info: [{"id": "c2abe47d-718c-4e51-b661-823b9fa7add9", "address": "fa:16:3e:24:f4:04", "network": {"id": "f167768e-3551-41c3-a3de-da1c1ed19a2c", "bridge": "br-int", "label": "tempest-network-smoke--1396042380", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2abe47d-71", "ovs_interfaceid": "c2abe47d-718c-4e51-b661-823b9fa7add9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "address": "fa:16:3e:3f:d1:2c", "network": {"id": "c053bf8e-59fa-4bf5-b9f4-2d0fb36f69da", "bridge": "br-int", "label": "tempest-network-smoke--557938854", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe3f:d12c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab0766c8-c7", "ovs_interfaceid": "ab0766c8-c7bc-4af1-9cc0-971475b014b7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:35:36 np0005538960 nova_compute[187252]: 2025-11-28 16:35:36.006 187256 DEBUG oslo_concurrency.lockutils [req-84fac40d-6701-40f2-8687-edaedd4faf25 req-2d7615cb-4054-4a43-aa43-3cc3c288131b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-b890bd95-f884-4215-91f2-749834092bc1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:35:36 np0005538960 nova_compute[187252]: 2025-11-28 16:35:36.608 187256 DEBUG nova.network.neutron [-] [instance: b890bd95-f884-4215-91f2-749834092bc1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:35:36 np0005538960 nova_compute[187252]: 2025-11-28 16:35:36.621 187256 INFO nova.compute.manager [-] [instance: b890bd95-f884-4215-91f2-749834092bc1] Took 3.37 seconds to deallocate network for instance.#033[00m
Nov 28 11:35:36 np0005538960 nova_compute[187252]: 2025-11-28 16:35:36.669 187256 DEBUG oslo_concurrency.lockutils [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:35:36 np0005538960 nova_compute[187252]: 2025-11-28 16:35:36.670 187256 DEBUG oslo_concurrency.lockutils [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:35:36 np0005538960 nova_compute[187252]: 2025-11-28 16:35:36.733 187256 DEBUG nova.compute.provider_tree [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:35:36 np0005538960 nova_compute[187252]: 2025-11-28 16:35:36.748 187256 DEBUG nova.scheduler.client.report [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:35:36 np0005538960 nova_compute[187252]: 2025-11-28 16:35:36.776 187256 DEBUG oslo_concurrency.lockutils [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:35:36 np0005538960 nova_compute[187252]: 2025-11-28 16:35:36.808 187256 INFO nova.scheduler.client.report [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Deleted allocations for instance b890bd95-f884-4215-91f2-749834092bc1#033[00m
Nov 28 11:35:36 np0005538960 nova_compute[187252]: 2025-11-28 16:35:36.886 187256 DEBUG oslo_concurrency.lockutils [None req-3b422355-78da-4616-8a45-28b1242be620 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.013s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:35:37 np0005538960 nova_compute[187252]: 2025-11-28 16:35:37.060 187256 DEBUG nova.compute.manager [req-1c50e505-0850-4d2e-8c3e-2a175656d838 req-3e988e07-5bc8-4550-9bef-ebde73cdb9b3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-plugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:35:37 np0005538960 nova_compute[187252]: 2025-11-28 16:35:37.060 187256 DEBUG oslo_concurrency.lockutils [req-1c50e505-0850-4d2e-8c3e-2a175656d838 req-3e988e07-5bc8-4550-9bef-ebde73cdb9b3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "b890bd95-f884-4215-91f2-749834092bc1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:35:37 np0005538960 nova_compute[187252]: 2025-11-28 16:35:37.061 187256 DEBUG oslo_concurrency.lockutils [req-1c50e505-0850-4d2e-8c3e-2a175656d838 req-3e988e07-5bc8-4550-9bef-ebde73cdb9b3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:35:37 np0005538960 nova_compute[187252]: 2025-11-28 16:35:37.061 187256 DEBUG oslo_concurrency.lockutils [req-1c50e505-0850-4d2e-8c3e-2a175656d838 req-3e988e07-5bc8-4550-9bef-ebde73cdb9b3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "b890bd95-f884-4215-91f2-749834092bc1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:35:37 np0005538960 nova_compute[187252]: 2025-11-28 16:35:37.061 187256 DEBUG nova.compute.manager [req-1c50e505-0850-4d2e-8c3e-2a175656d838 req-3e988e07-5bc8-4550-9bef-ebde73cdb9b3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] No waiting events found dispatching network-vif-plugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:35:37 np0005538960 nova_compute[187252]: 2025-11-28 16:35:37.062 187256 WARNING nova.compute.manager [req-1c50e505-0850-4d2e-8c3e-2a175656d838 req-3e988e07-5bc8-4550-9bef-ebde73cdb9b3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received unexpected event network-vif-plugged-ab0766c8-c7bc-4af1-9cc0-971475b014b7 for instance with vm_state deleted and task_state None.#033[00m
Nov 28 11:35:37 np0005538960 nova_compute[187252]: 2025-11-28 16:35:37.062 187256 DEBUG nova.compute.manager [req-1c50e505-0850-4d2e-8c3e-2a175656d838 req-3e988e07-5bc8-4550-9bef-ebde73cdb9b3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-deleted-ab0766c8-c7bc-4af1-9cc0-971475b014b7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:35:37 np0005538960 nova_compute[187252]: 2025-11-28 16:35:37.062 187256 DEBUG nova.compute.manager [req-1c50e505-0850-4d2e-8c3e-2a175656d838 req-3e988e07-5bc8-4550-9bef-ebde73cdb9b3 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: b890bd95-f884-4215-91f2-749834092bc1] Received event network-vif-deleted-c2abe47d-718c-4e51-b661-823b9fa7add9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:35:38 np0005538960 nova_compute[187252]: 2025-11-28 16:35:38.197 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:38 np0005538960 nova_compute[187252]: 2025-11-28 16:35:38.726 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:41 np0005538960 podman[224236]: 2025-11-28 16:35:41.156709306 +0000 UTC m=+0.062584918 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 11:35:43 np0005538960 nova_compute[187252]: 2025-11-28 16:35:43.200 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:43 np0005538960 nova_compute[187252]: 2025-11-28 16:35:43.728 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:44 np0005538960 nova_compute[187252]: 2025-11-28 16:35:44.826 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:44 np0005538960 nova_compute[187252]: 2025-11-28 16:35:44.990 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:46 np0005538960 podman[224257]: 2025-11-28 16:35:46.160894078 +0000 UTC m=+0.065162908 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:35:48 np0005538960 nova_compute[187252]: 2025-11-28 16:35:48.158 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347733.1559777, b890bd95-f884-4215-91f2-749834092bc1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:35:48 np0005538960 nova_compute[187252]: 2025-11-28 16:35:48.159 187256 INFO nova.compute.manager [-] [instance: b890bd95-f884-4215-91f2-749834092bc1] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:35:48 np0005538960 nova_compute[187252]: 2025-11-28 16:35:48.184 187256 DEBUG nova.compute.manager [None req-2996a161-c581-40ab-aa1d-7eafba4540a2 - - - - - -] [instance: b890bd95-f884-4215-91f2-749834092bc1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:35:48 np0005538960 nova_compute[187252]: 2025-11-28 16:35:48.202 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:48 np0005538960 nova_compute[187252]: 2025-11-28 16:35:48.730 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:52 np0005538960 podman[224284]: 2025-11-28 16:35:52.191797631 +0000 UTC m=+0.098391099 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:35:53 np0005538960 nova_compute[187252]: 2025-11-28 16:35:53.204 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:53 np0005538960 nova_compute[187252]: 2025-11-28 16:35:53.732 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:55 np0005538960 podman[224312]: 2025-11-28 16:35:55.178493736 +0000 UTC m=+0.081850335 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:35:55 np0005538960 podman[224311]: 2025-11-28 16:35:55.182165165 +0000 UTC m=+0.091843598 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 11:35:58 np0005538960 nova_compute[187252]: 2025-11-28 16:35:58.206 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:35:58 np0005538960 nova_compute[187252]: 2025-11-28 16:35:58.734 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:01 np0005538960 podman[224346]: 2025-11-28 16:36:01.148610317 +0000 UTC m=+0.054123909 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:36:03 np0005538960 podman[224371]: 2025-11-28 16:36:03.161043209 +0000 UTC m=+0.069774210 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, version=9.6)
Nov 28 11:36:03 np0005538960 nova_compute[187252]: 2025-11-28 16:36:03.208 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:03 np0005538960 nova_compute[187252]: 2025-11-28 16:36:03.735 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:06.356 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:36:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:06.356 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:36:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:06.356 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:36:08 np0005538960 nova_compute[187252]: 2025-11-28 16:36:08.210 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:08 np0005538960 nova_compute[187252]: 2025-11-28 16:36:08.737 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:11 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:11.596 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:91:ba 10.100.0.2 2001:db8::f816:3eff:fe66:91ba'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe66:91ba/64', 'neutron:device_id': 'ovnmeta-a88eb275-720c-4f9e-895e-b616f8a3f43b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a88eb275-720c-4f9e-895e-b616f8a3f43b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07bf26b0-6904-4fb3-95d1-f124ba3153f8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=dbd98595-3330-40e5-a555-d684d4581269) old=Port_Binding(mac=['fa:16:3e:66:91:ba 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-a88eb275-720c-4f9e-895e-b616f8a3f43b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a88eb275-720c-4f9e-895e-b616f8a3f43b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:36:11 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:11.597 104369 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port dbd98595-3330-40e5-a555-d684d4581269 in datapath a88eb275-720c-4f9e-895e-b616f8a3f43b updated#033[00m
Nov 28 11:36:11 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:11.599 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a88eb275-720c-4f9e-895e-b616f8a3f43b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:36:11 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:11.600 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[01e62990-7396-4623-b449-02a3f1aeeeba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:36:12 np0005538960 podman[224392]: 2025-11-28 16:36:12.155167147 +0000 UTC m=+0.065073447 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Nov 28 11:36:13 np0005538960 nova_compute[187252]: 2025-11-28 16:36:13.212 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:13 np0005538960 nova_compute[187252]: 2025-11-28 16:36:13.739 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:17 np0005538960 podman[224412]: 2025-11-28 16:36:17.144206629 +0000 UTC m=+0.051914216 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:36:18 np0005538960 nova_compute[187252]: 2025-11-28 16:36:18.214 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:18 np0005538960 nova_compute[187252]: 2025-11-28 16:36:18.741 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:20.764 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:66:91:ba 10.100.0.2 2001:db8:0:1:f816:3eff:fe66:91ba 2001:db8::f816:3eff:fe66:91ba'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe66:91ba/64 2001:db8::f816:3eff:fe66:91ba/64', 'neutron:device_id': 'ovnmeta-a88eb275-720c-4f9e-895e-b616f8a3f43b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a88eb275-720c-4f9e-895e-b616f8a3f43b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=07bf26b0-6904-4fb3-95d1-f124ba3153f8, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=dbd98595-3330-40e5-a555-d684d4581269) old=Port_Binding(mac=['fa:16:3e:66:91:ba 10.100.0.2 2001:db8::f816:3eff:fe66:91ba'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe66:91ba/64', 'neutron:device_id': 'ovnmeta-a88eb275-720c-4f9e-895e-b616f8a3f43b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a88eb275-720c-4f9e-895e-b616f8a3f43b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:36:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:20.765 104369 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port dbd98595-3330-40e5-a555-d684d4581269 in datapath a88eb275-720c-4f9e-895e-b616f8a3f43b updated#033[00m
Nov 28 11:36:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:20.766 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a88eb275-720c-4f9e-895e-b616f8a3f43b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:36:20 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:20.767 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[42e2e4f8-4d76-4248-8973-a36edc3e25af]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:36:23 np0005538960 podman[224438]: 2025-11-28 16:36:23.187873592 +0000 UTC m=+0.096413579 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 11:36:23 np0005538960 nova_compute[187252]: 2025-11-28 16:36:23.216 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:23 np0005538960 nova_compute[187252]: 2025-11-28 16:36:23.743 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:26 np0005538960 podman[224467]: 2025-11-28 16:36:26.147660691 +0000 UTC m=+0.053308409 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 11:36:26 np0005538960 podman[224466]: 2025-11-28 16:36:26.15869947 +0000 UTC m=+0.068488579 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 11:36:26 np0005538960 nova_compute[187252]: 2025-11-28 16:36:26.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:27 np0005538960 nova_compute[187252]: 2025-11-28 16:36:27.327 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:27 np0005538960 nova_compute[187252]: 2025-11-28 16:36:27.327 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:27 np0005538960 nova_compute[187252]: 2025-11-28 16:36:27.328 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:36:28 np0005538960 nova_compute[187252]: 2025-11-28 16:36:28.218 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:28 np0005538960 nova_compute[187252]: 2025-11-28 16:36:28.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:28 np0005538960 nova_compute[187252]: 2025-11-28 16:36:28.745 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:29 np0005538960 nova_compute[187252]: 2025-11-28 16:36:29.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:29 np0005538960 nova_compute[187252]: 2025-11-28 16:36:29.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:36:29 np0005538960 nova_compute[187252]: 2025-11-28 16:36:29.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:36:29 np0005538960 nova_compute[187252]: 2025-11-28 16:36:29.327 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:36:30 np0005538960 nova_compute[187252]: 2025-11-28 16:36:30.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:30 np0005538960 nova_compute[187252]: 2025-11-28 16:36:30.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 11:36:30 np0005538960 nova_compute[187252]: 2025-11-28 16:36:30.331 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 11:36:31 np0005538960 nova_compute[187252]: 2025-11-28 16:36:31.332 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:32 np0005538960 podman[224507]: 2025-11-28 16:36:32.177972589 +0000 UTC m=+0.083215557 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:36:32 np0005538960 nova_compute[187252]: 2025-11-28 16:36:32.310 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.220 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.337 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.337 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.337 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.337 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.523 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.525 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5762MB free_disk=73.33790969848633GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.525 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.525 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.655 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.656 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:36:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:33.691 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.693 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:33 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:33.693 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.747 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.758 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:36:33 np0005538960 nova_compute[187252]: 2025-11-28 16:36:33.771 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:36:34 np0005538960 podman[224532]: 2025-11-28 16:36:34.148649545 +0000 UTC m=+0.058691781 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., version=9.6)
Nov 28 11:36:34 np0005538960 nova_compute[187252]: 2025-11-28 16:36:34.220 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:36:34 np0005538960 nova_compute[187252]: 2025-11-28 16:36:34.221 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.696s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:36:35 np0005538960 nova_compute[187252]: 2025-11-28 16:36:35.222 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.314 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.315 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:36:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:36:35 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:36:35.695 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:36:36 np0005538960 nova_compute[187252]: 2025-11-28 16:36:36.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:36 np0005538960 ovn_controller[95460]: 2025-11-28T16:36:36Z|00237|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Nov 28 11:36:37 np0005538960 nova_compute[187252]: 2025-11-28 16:36:37.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:36:37 np0005538960 nova_compute[187252]: 2025-11-28 16:36:37.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 11:36:38 np0005538960 nova_compute[187252]: 2025-11-28 16:36:38.222 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:38 np0005538960 nova_compute[187252]: 2025-11-28 16:36:38.749 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:43 np0005538960 podman[224553]: 2025-11-28 16:36:43.15225894 +0000 UTC m=+0.061438367 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:36:43 np0005538960 nova_compute[187252]: 2025-11-28 16:36:43.224 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:43 np0005538960 nova_compute[187252]: 2025-11-28 16:36:43.751 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:48 np0005538960 podman[224574]: 2025-11-28 16:36:48.158962703 +0000 UTC m=+0.056951288 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:36:48 np0005538960 nova_compute[187252]: 2025-11-28 16:36:48.226 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:48 np0005538960 nova_compute[187252]: 2025-11-28 16:36:48.754 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:53 np0005538960 nova_compute[187252]: 2025-11-28 16:36:53.259 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:53 np0005538960 nova_compute[187252]: 2025-11-28 16:36:53.756 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:54 np0005538960 podman[224599]: 2025-11-28 16:36:54.216109394 +0000 UTC m=+0.120731692 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 11:36:57 np0005538960 podman[224626]: 2025-11-28 16:36:57.150095586 +0000 UTC m=+0.054428497 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 11:36:57 np0005538960 podman[224625]: 2025-11-28 16:36:57.173058265 +0000 UTC m=+0.081536997 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:36:58 np0005538960 nova_compute[187252]: 2025-11-28 16:36:58.262 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:36:58 np0005538960 nova_compute[187252]: 2025-11-28 16:36:58.758 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:03 np0005538960 podman[224659]: 2025-11-28 16:37:03.160376556 +0000 UTC m=+0.062796500 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:37:03 np0005538960 nova_compute[187252]: 2025-11-28 16:37:03.265 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:03 np0005538960 nova_compute[187252]: 2025-11-28 16:37:03.759 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:05 np0005538960 podman[224683]: 2025-11-28 16:37:05.159103766 +0000 UTC m=+0.064564104 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_id=edpm, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41)
Nov 28 11:37:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:37:06.356 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:37:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:37:06.357 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:37:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:37:06.357 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:37:08 np0005538960 nova_compute[187252]: 2025-11-28 16:37:08.268 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:08 np0005538960 nova_compute[187252]: 2025-11-28 16:37:08.795 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:13 np0005538960 nova_compute[187252]: 2025-11-28 16:37:13.326 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:13 np0005538960 nova_compute[187252]: 2025-11-28 16:37:13.797 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:14 np0005538960 podman[224706]: 2025-11-28 16:37:14.152808611 +0000 UTC m=+0.055612225 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 11:37:18 np0005538960 nova_compute[187252]: 2025-11-28 16:37:18.328 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:18 np0005538960 nova_compute[187252]: 2025-11-28 16:37:18.799 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:19 np0005538960 podman[224730]: 2025-11-28 16:37:19.154775798 +0000 UTC m=+0.053716019 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:37:23 np0005538960 nova_compute[187252]: 2025-11-28 16:37:23.330 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:23 np0005538960 nova_compute[187252]: 2025-11-28 16:37:23.800 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:25 np0005538960 podman[224756]: 2025-11-28 16:37:25.257187571 +0000 UTC m=+0.084453268 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 11:37:28 np0005538960 podman[224782]: 2025-11-28 16:37:28.155295099 +0000 UTC m=+0.059578322 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:37:28 np0005538960 podman[224783]: 2025-11-28 16:37:28.160226919 +0000 UTC m=+0.059678265 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 11:37:28 np0005538960 nova_compute[187252]: 2025-11-28 16:37:28.332 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:37:28 np0005538960 nova_compute[187252]: 2025-11-28 16:37:28.333 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:28 np0005538960 nova_compute[187252]: 2025-11-28 16:37:28.802 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:29 np0005538960 nova_compute[187252]: 2025-11-28 16:37:29.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:37:29 np0005538960 nova_compute[187252]: 2025-11-28 16:37:29.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:37:29 np0005538960 nova_compute[187252]: 2025-11-28 16:37:29.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:37:29 np0005538960 nova_compute[187252]: 2025-11-28 16:37:29.326 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:37:29 np0005538960 nova_compute[187252]: 2025-11-28 16:37:29.327 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:37:29 np0005538960 nova_compute[187252]: 2025-11-28 16:37:29.327 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:37:30 np0005538960 nova_compute[187252]: 2025-11-28 16:37:30.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:37:31 np0005538960 nova_compute[187252]: 2025-11-28 16:37:31.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:37:33 np0005538960 nova_compute[187252]: 2025-11-28 16:37:33.334 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:33 np0005538960 nova_compute[187252]: 2025-11-28 16:37:33.804 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:34 np0005538960 podman[224821]: 2025-11-28 16:37:34.155669697 +0000 UTC m=+0.058588698 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.310 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.335 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.336 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.336 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.336 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.509 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.511 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5751MB free_disk=73.33801651000977GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.511 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.511 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.581 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.582 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.608 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.622 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.623 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:37:34 np0005538960 nova_compute[187252]: 2025-11-28 16:37:34.623 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:37:35 np0005538960 nova_compute[187252]: 2025-11-28 16:37:35.620 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:37:35 np0005538960 podman[224846]: 2025-11-28 16:37:35.712041519 +0000 UTC m=+0.062667036 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6)
Nov 28 11:37:36 np0005538960 nova_compute[187252]: 2025-11-28 16:37:36.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:37:38 np0005538960 nova_compute[187252]: 2025-11-28 16:37:38.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:37:38 np0005538960 nova_compute[187252]: 2025-11-28 16:37:38.337 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:38 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:37:38.686 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:37:38 np0005538960 nova_compute[187252]: 2025-11-28 16:37:38.685 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:38 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:37:38.687 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:37:38 np0005538960 nova_compute[187252]: 2025-11-28 16:37:38.806 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:37:41.689 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:37:43 np0005538960 nova_compute[187252]: 2025-11-28 16:37:43.340 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:43 np0005538960 nova_compute[187252]: 2025-11-28 16:37:43.809 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:45 np0005538960 podman[224870]: 2025-11-28 16:37:45.151259308 +0000 UTC m=+0.058863335 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:37:48 np0005538960 nova_compute[187252]: 2025-11-28 16:37:48.342 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:48 np0005538960 nova_compute[187252]: 2025-11-28 16:37:48.812 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:50 np0005538960 podman[224890]: 2025-11-28 16:37:50.158931904 +0000 UTC m=+0.059785677 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:37:53 np0005538960 nova_compute[187252]: 2025-11-28 16:37:53.344 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:53 np0005538960 nova_compute[187252]: 2025-11-28 16:37:53.813 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:56 np0005538960 podman[224916]: 2025-11-28 16:37:56.187984942 +0000 UTC m=+0.096417400 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Nov 28 11:37:58 np0005538960 nova_compute[187252]: 2025-11-28 16:37:58.347 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:58 np0005538960 nova_compute[187252]: 2025-11-28 16:37:58.815 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:37:59 np0005538960 podman[224942]: 2025-11-28 16:37:59.159297532 +0000 UTC m=+0.058226240 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:37:59 np0005538960 podman[224943]: 2025-11-28 16:37:59.166459586 +0000 UTC m=+0.059607753 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 11:38:03 np0005538960 nova_compute[187252]: 2025-11-28 16:38:03.351 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:03 np0005538960 nova_compute[187252]: 2025-11-28 16:38:03.839 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:05 np0005538960 podman[224983]: 2025-11-28 16:38:05.151104231 +0000 UTC m=+0.060411672 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:38:06 np0005538960 podman[225008]: 2025-11-28 16:38:06.161993146 +0000 UTC m=+0.066277755 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Nov 28 11:38:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:06.361 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:06.362 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:06.362 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:08 np0005538960 nova_compute[187252]: 2025-11-28 16:38:08.355 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:08 np0005538960 nova_compute[187252]: 2025-11-28 16:38:08.840 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:12 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:12.458 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:fd:52 10.100.0.2 2001:db8::f816:3eff:fe52:fd52'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe52:fd52/64', 'neutron:device_id': 'ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=17822ff8-9a5b-4363-9dcc-def524fd1d85, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ad292d62-0a33-4bcb-bba2-e947dea85e36) old=Port_Binding(mac=['fa:16:3e:52:fd:52 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:38:12 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:12.461 104369 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ad292d62-0a33-4bcb-bba2-e947dea85e36 in datapath af1be1f9-0fe9-4225-a7a3-625e96eb67d3 updated#033[00m
Nov 28 11:38:12 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:12.462 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network af1be1f9-0fe9-4225-a7a3-625e96eb67d3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:38:12 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:12.463 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[7af37cdc-100d-4084-bce2-447d0edf9242]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:13 np0005538960 nova_compute[187252]: 2025-11-28 16:38:13.358 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:13 np0005538960 nova_compute[187252]: 2025-11-28 16:38:13.841 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:16 np0005538960 podman[225030]: 2025-11-28 16:38:16.152996638 +0000 UTC m=+0.060423633 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:38:18 np0005538960 nova_compute[187252]: 2025-11-28 16:38:18.362 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:18.491 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:fd:52 10.100.0.2 2001:db8:0:1:f816:3eff:fe52:fd52 2001:db8::f816:3eff:fe52:fd52'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe52:fd52/64 2001:db8::f816:3eff:fe52:fd52/64', 'neutron:device_id': 'ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=17822ff8-9a5b-4363-9dcc-def524fd1d85, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ad292d62-0a33-4bcb-bba2-e947dea85e36) old=Port_Binding(mac=['fa:16:3e:52:fd:52 10.100.0.2 2001:db8::f816:3eff:fe52:fd52'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe52:fd52/64', 'neutron:device_id': 'ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:38:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:18.492 104369 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ad292d62-0a33-4bcb-bba2-e947dea85e36 in datapath af1be1f9-0fe9-4225-a7a3-625e96eb67d3 updated#033[00m
Nov 28 11:38:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:18.493 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network af1be1f9-0fe9-4225-a7a3-625e96eb67d3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:38:18 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:18.494 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[14bbb28f-bbaf-45ba-ba1b-31f2fe6f14b7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:18 np0005538960 nova_compute[187252]: 2025-11-28 16:38:18.843 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:21 np0005538960 podman[225052]: 2025-11-28 16:38:21.150000034 +0000 UTC m=+0.058781793 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:38:23 np0005538960 nova_compute[187252]: 2025-11-28 16:38:23.366 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:23 np0005538960 nova_compute[187252]: 2025-11-28 16:38:23.846 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:25 np0005538960 nova_compute[187252]: 2025-11-28 16:38:25.924 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "95f49283-16da-4af6-b1a2-acba779ab5e4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:25 np0005538960 nova_compute[187252]: 2025-11-28 16:38:25.925 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:25 np0005538960 nova_compute[187252]: 2025-11-28 16:38:25.951 187256 DEBUG nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.045 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.045 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.053 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.054 187256 INFO nova.compute.claims [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Claim successful on node compute-1.ctlplane.example.com#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.208 187256 DEBUG nova.compute.provider_tree [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.222 187256 DEBUG nova.scheduler.client.report [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.247 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.248 187256 DEBUG nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.298 187256 DEBUG nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.298 187256 DEBUG nova.network.neutron [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.321 187256 INFO nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.337 187256 DEBUG nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.450 187256 DEBUG nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.452 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.452 187256 INFO nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Creating image(s)#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.453 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "/var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.453 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "/var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.454 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "/var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.468 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.528 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.530 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.530 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.543 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.601 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.602 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.626 187256 DEBUG nova.policy [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.639 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc,backing_fmt=raw /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.640 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "a5028f54b566615edf539c536ce9ee5ddf1d51dc" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.641 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.696 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.698 187256 DEBUG nova.virt.disk.api [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Checking if we can resize image /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.698 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.755 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.756 187256 DEBUG nova.virt.disk.api [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Cannot resize image /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.757 187256 DEBUG nova.objects.instance [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lazy-loading 'migration_context' on Instance uuid 95f49283-16da-4af6-b1a2-acba779ab5e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.769 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.769 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Ensure instance console log exists: /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.770 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.770 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:26 np0005538960 nova_compute[187252]: 2025-11-28 16:38:26.771 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:27 np0005538960 podman[225091]: 2025-11-28 16:38:27.178137419 +0000 UTC m=+0.077731374 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:38:27 np0005538960 nova_compute[187252]: 2025-11-28 16:38:27.577 187256 DEBUG nova.network.neutron [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Successfully created port: 56cebb30-3170-4bdb-bf92-e59e3d4a20ea _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Nov 28 11:38:28 np0005538960 nova_compute[187252]: 2025-11-28 16:38:28.368 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:28 np0005538960 nova_compute[187252]: 2025-11-28 16:38:28.773 187256 DEBUG nova.network.neutron [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Successfully updated port: 56cebb30-3170-4bdb-bf92-e59e3d4a20ea _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Nov 28 11:38:28 np0005538960 nova_compute[187252]: 2025-11-28 16:38:28.788 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:38:28 np0005538960 nova_compute[187252]: 2025-11-28 16:38:28.789 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquired lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:38:28 np0005538960 nova_compute[187252]: 2025-11-28 16:38:28.789 187256 DEBUG nova.network.neutron [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Nov 28 11:38:28 np0005538960 nova_compute[187252]: 2025-11-28 16:38:28.848 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:28 np0005538960 nova_compute[187252]: 2025-11-28 16:38:28.867 187256 DEBUG nova.compute.manager [req-f9d5bfbd-77b2-47b9-9b72-9342a6258a54 req-8ebd7f08-3714-4806-ad83-44c034ca70d2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received event network-changed-56cebb30-3170-4bdb-bf92-e59e3d4a20ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:38:28 np0005538960 nova_compute[187252]: 2025-11-28 16:38:28.867 187256 DEBUG nova.compute.manager [req-f9d5bfbd-77b2-47b9-9b72-9342a6258a54 req-8ebd7f08-3714-4806-ad83-44c034ca70d2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Refreshing instance network info cache due to event network-changed-56cebb30-3170-4bdb-bf92-e59e3d4a20ea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:38:28 np0005538960 nova_compute[187252]: 2025-11-28 16:38:28.868 187256 DEBUG oslo_concurrency.lockutils [req-f9d5bfbd-77b2-47b9-9b72-9342a6258a54 req-8ebd7f08-3714-4806-ad83-44c034ca70d2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:38:28 np0005538960 nova_compute[187252]: 2025-11-28 16:38:28.907 187256 DEBUG nova.network.neutron [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Nov 28 11:38:29 np0005538960 nova_compute[187252]: 2025-11-28 16:38:29.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:38:29 np0005538960 nova_compute[187252]: 2025-11-28 16:38:29.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:38:29 np0005538960 nova_compute[187252]: 2025-11-28 16:38:29.971 187256 DEBUG nova.network.neutron [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Updating instance_info_cache with network_info: [{"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.009 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Releasing lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.010 187256 DEBUG nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Instance network_info: |[{"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.011 187256 DEBUG oslo_concurrency.lockutils [req-f9d5bfbd-77b2-47b9-9b72-9342a6258a54 req-8ebd7f08-3714-4806-ad83-44c034ca70d2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.011 187256 DEBUG nova.network.neutron [req-f9d5bfbd-77b2-47b9-9b72-9342a6258a54 req-8ebd7f08-3714-4806-ad83-44c034ca70d2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Refreshing network info cache for port 56cebb30-3170-4bdb-bf92-e59e3d4a20ea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.015 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Start _get_guest_xml network_info=[{"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'guest_format': None, 'encrypted': False, 'size': 0, 'device_type': 'disk', 'device_name': '/dev/vda', 'encryption_secret_uuid': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encryption_format': None, 'encryption_options': None, 'image_id': '48a87826-de14-4dde-9157-9baf2160cd7d'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.020 187256 WARNING nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.025 187256 DEBUG nova.virt.libvirt.host [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.026 187256 DEBUG nova.virt.libvirt.host [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.030 187256 DEBUG nova.virt.libvirt.host [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.031 187256 DEBUG nova.virt.libvirt.host [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.032 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.033 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-11-28T16:17:17Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='c90217bd-1e89-4c68-8e01-33bf1cee456c',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-11-28T16:17:28Z,direct_url=<?>,disk_format='qcow2',id=48a87826-de14-4dde-9157-9baf2160cd7d,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0d115147376746f886db4c9ce486a477',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-11-28T16:17:29Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.033 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.033 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.033 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.034 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.034 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.034 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.034 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.034 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.035 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.035 187256 DEBUG nova.virt.hardware [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.039 187256 DEBUG nova.virt.libvirt.vif [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1667107488',display_name='tempest-TestGettingAddress-server-1667107488',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1667107488',id=53,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOXqe3Ilt1rqiRPLA/CKrfLDll5E0YCx6n+wUQEHt+Ya18tmZcz7kF+TnTzGspdxJJcrmGDhBmx5/KTuXA2lK6EhBo1O/JZqueRg/YORf1eKURqrPFu///5Uh18nq5Mg+A==',key_name='tempest-TestGettingAddress-1176227644',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-lwaplynr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:38:26Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=95f49283-16da-4af6-b1a2-acba779ab5e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.039 187256 DEBUG nova.network.os_vif_util [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.040 187256 DEBUG nova.network.os_vif_util [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:5f:7b,bridge_name='br-int',has_traffic_filtering=True,id=56cebb30-3170-4bdb-bf92-e59e3d4a20ea,network=Network(af1be1f9-0fe9-4225-a7a3-625e96eb67d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56cebb30-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.041 187256 DEBUG nova.objects.instance [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lazy-loading 'pci_devices' on Instance uuid 95f49283-16da-4af6-b1a2-acba779ab5e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.056 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] End _get_guest_xml xml=<domain type="kvm">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <uuid>95f49283-16da-4af6-b1a2-acba779ab5e4</uuid>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <name>instance-00000035</name>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <memory>131072</memory>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <vcpu>1</vcpu>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <metadata>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <nova:name>tempest-TestGettingAddress-server-1667107488</nova:name>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <nova:creationTime>2025-11-28 16:38:30</nova:creationTime>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <nova:flavor name="m1.nano">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:        <nova:memory>128</nova:memory>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:        <nova:disk>1</nova:disk>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:        <nova:swap>0</nova:swap>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:        <nova:ephemeral>0</nova:ephemeral>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:        <nova:vcpus>1</nova:vcpus>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      </nova:flavor>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <nova:owner>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:        <nova:user uuid="23b8e0c173df4c2883fccd8cb472e427">tempest-TestGettingAddress-2054466537-project-member</nova:user>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:        <nova:project uuid="b5f802fe6e0b4d62bba6143515207a40">tempest-TestGettingAddress-2054466537</nova:project>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      </nova:owner>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <nova:root type="image" uuid="48a87826-de14-4dde-9157-9baf2160cd7d"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <nova:ports>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:        <nova:port uuid="56cebb30-3170-4bdb-bf92-e59e3d4a20ea">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe46:5f7b" ipVersion="6"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe46:5f7b" ipVersion="6"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:        </nova:port>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      </nova:ports>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    </nova:instance>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  </metadata>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <sysinfo type="smbios">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <system>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <entry name="manufacturer">RDO</entry>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <entry name="product">OpenStack Compute</entry>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <entry name="serial">95f49283-16da-4af6-b1a2-acba779ab5e4</entry>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <entry name="uuid">95f49283-16da-4af6-b1a2-acba779ab5e4</entry>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <entry name="family">Virtual Machine</entry>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    </system>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  </sysinfo>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <os>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <type arch="x86_64" machine="q35">hvm</type>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <boot dev="hd"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <smbios mode="sysinfo"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  </os>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <features>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <acpi/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <apic/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <vmcoreinfo/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  </features>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <clock offset="utc">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <timer name="pit" tickpolicy="delay"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <timer name="rtc" tickpolicy="catchup"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <timer name="hpet" present="no"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  </clock>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <cpu mode="custom" match="exact">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <model>Nehalem</model>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <topology sockets="1" cores="1" threads="1"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  </cpu>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  <devices>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <disk type="file" device="disk">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <driver name="qemu" type="qcow2" cache="none"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <target dev="vda" bus="virtio"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <disk type="file" device="cdrom">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <driver name="qemu" type="raw" cache="none"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <source file="/var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk.config"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <target dev="sda" bus="sata"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    </disk>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <interface type="ethernet">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <mac address="fa:16:3e:46:5f:7b"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <driver name="vhost" rx_queue_size="512"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <mtu size="1442"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <target dev="tap56cebb30-31"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    </interface>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <serial type="pty">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <log file="/var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/console.log" append="off"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    </serial>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <video>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <model type="virtio"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    </video>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <input type="tablet" bus="usb"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <rng model="virtio">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <backend model="random">/dev/urandom</backend>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    </rng>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="pci" model="pcie-root-port"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <controller type="usb" index="0"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    <memballoon model="virtio">
Nov 28 11:38:30 np0005538960 nova_compute[187252]:      <stats period="10"/>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:    </memballoon>
Nov 28 11:38:30 np0005538960 nova_compute[187252]:  </devices>
Nov 28 11:38:30 np0005538960 nova_compute[187252]: </domain>
Nov 28 11:38:30 np0005538960 nova_compute[187252]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.058 187256 DEBUG nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Preparing to wait for external event network-vif-plugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.058 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.058 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.058 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.059 187256 DEBUG nova.virt.libvirt.vif [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-11-28T16:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1667107488',display_name='tempest-TestGettingAddress-server-1667107488',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1667107488',id=53,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOXqe3Ilt1rqiRPLA/CKrfLDll5E0YCx6n+wUQEHt+Ya18tmZcz7kF+TnTzGspdxJJcrmGDhBmx5/KTuXA2lK6EhBo1O/JZqueRg/YORf1eKURqrPFu///5Uh18nq5Mg+A==',key_name='tempest-TestGettingAddress-1176227644',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-lwaplynr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-11-28T16:38:26Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=95f49283-16da-4af6-b1a2-acba779ab5e4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.059 187256 DEBUG nova.network.os_vif_util [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.060 187256 DEBUG nova.network.os_vif_util [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:5f:7b,bridge_name='br-int',has_traffic_filtering=True,id=56cebb30-3170-4bdb-bf92-e59e3d4a20ea,network=Network(af1be1f9-0fe9-4225-a7a3-625e96eb67d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56cebb30-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.061 187256 DEBUG os_vif [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:5f:7b,bridge_name='br-int',has_traffic_filtering=True,id=56cebb30-3170-4bdb-bf92-e59e3d4a20ea,network=Network(af1be1f9-0fe9-4225-a7a3-625e96eb67d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56cebb30-31') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.061 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.061 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.062 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.065 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.066 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap56cebb30-31, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.066 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap56cebb30-31, col_values=(('external_ids', {'iface-id': '56cebb30-3170-4bdb-bf92-e59e3d4a20ea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:5f:7b', 'vm-uuid': '95f49283-16da-4af6-b1a2-acba779ab5e4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.067 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:30 np0005538960 NetworkManager[55548]: <info>  [1764347910.0689] manager: (tap56cebb30-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/125)
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.070 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.076 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.078 187256 INFO os_vif [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:5f:7b,bridge_name='br-int',has_traffic_filtering=True,id=56cebb30-3170-4bdb-bf92-e59e3d4a20ea,network=Network(af1be1f9-0fe9-4225-a7a3-625e96eb67d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56cebb30-31')#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.128 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.129 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.129 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] No VIF found with MAC fa:16:3e:46:5f:7b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.129 187256 INFO nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Using config drive#033[00m
Nov 28 11:38:30 np0005538960 podman[225120]: 2025-11-28 16:38:30.157882715 +0000 UTC m=+0.061112629 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:38:30 np0005538960 podman[225122]: 2025-11-28 16:38:30.175108596 +0000 UTC m=+0.072314343 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2)
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.344 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.344 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.345 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.605 187256 INFO nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Creating config drive at /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk.config#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.612 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzsvwsrle execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.739 187256 DEBUG oslo_concurrency.processutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzsvwsrle" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:38:30 np0005538960 kernel: tap56cebb30-31: entered promiscuous mode
Nov 28 11:38:30 np0005538960 NetworkManager[55548]: <info>  [1764347910.8142] manager: (tap56cebb30-31): new Tun device (/org/freedesktop/NetworkManager/Devices/126)
Nov 28 11:38:30 np0005538960 ovn_controller[95460]: 2025-11-28T16:38:30Z|00238|binding|INFO|Claiming lport 56cebb30-3170-4bdb-bf92-e59e3d4a20ea for this chassis.
Nov 28 11:38:30 np0005538960 ovn_controller[95460]: 2025-11-28T16:38:30Z|00239|binding|INFO|56cebb30-3170-4bdb-bf92-e59e3d4a20ea: Claiming fa:16:3e:46:5f:7b 10.100.0.11 2001:db8:0:1:f816:3eff:fe46:5f7b 2001:db8::f816:3eff:fe46:5f7b
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.816 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.821 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.823 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.828 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.836 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:5f:7b 10.100.0.11 2001:db8:0:1:f816:3eff:fe46:5f7b 2001:db8::f816:3eff:fe46:5f7b'], port_security=['fa:16:3e:46:5f:7b 10.100.0.11 2001:db8:0:1:f816:3eff:fe46:5f7b 2001:db8::f816:3eff:fe46:5f7b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8:0:1:f816:3eff:fe46:5f7b/64 2001:db8::f816:3eff:fe46:5f7b/64', 'neutron:device_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': '30d32d94-06c7-4d64-991b-715c9f46c5ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=17822ff8-9a5b-4363-9dcc-def524fd1d85, chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=56cebb30-3170-4bdb-bf92-e59e3d4a20ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.838 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 56cebb30-3170-4bdb-bf92-e59e3d4a20ea in datapath af1be1f9-0fe9-4225-a7a3-625e96eb67d3 bound to our chassis#033[00m
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.839 104369 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network af1be1f9-0fe9-4225-a7a3-625e96eb67d3#033[00m
Nov 28 11:38:30 np0005538960 systemd-udevd[225176]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.852 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[cc9e8df4-9876-405f-baa1-20e8f4f185cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:30 np0005538960 systemd-machined[153518]: New machine qemu-19-instance-00000035.
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.853 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaf1be1f9-01 in ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.855 214244 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaf1be1f9-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.855 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[3d99f0e3-c8a9-474b-a64d-b59937e90aa4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.856 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[79312f12-c34e-420b-9969-4cdff3831116]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:30 np0005538960 NetworkManager[55548]: <info>  [1764347910.8684] device (tap56cebb30-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Nov 28 11:38:30 np0005538960 NetworkManager[55548]: <info>  [1764347910.8698] device (tap56cebb30-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.868 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[b286d42f-38d3-4aad-83d1-680911f72a0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:30 np0005538960 systemd[1]: Started Virtual Machine qemu-19-instance-00000035.
Nov 28 11:38:30 np0005538960 ovn_controller[95460]: 2025-11-28T16:38:30Z|00240|binding|INFO|Setting lport 56cebb30-3170-4bdb-bf92-e59e3d4a20ea ovn-installed in OVS
Nov 28 11:38:30 np0005538960 ovn_controller[95460]: 2025-11-28T16:38:30Z|00241|binding|INFO|Setting lport 56cebb30-3170-4bdb-bf92-e59e3d4a20ea up in Southbound
Nov 28 11:38:30 np0005538960 nova_compute[187252]: 2025-11-28 16:38:30.883 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.885 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[b89320d3-2ed1-4065-bbc5-973e12b49357]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.919 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[2a9a4d62-084e-443b-b5c5-94d2fbcc0ffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.927 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[37c6be57-d98d-4689-915c-8929ef7c2994]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:30 np0005538960 NetworkManager[55548]: <info>  [1764347910.9285] manager: (tapaf1be1f9-00): new Veth device (/org/freedesktop/NetworkManager/Devices/127)
Nov 28 11:38:30 np0005538960 systemd-udevd[225179]: Network interface NamePolicy= disabled on kernel command line.
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.961 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[00353656-f4a3-4307-9df3-d7899019a858]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.965 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[e495e1bc-0f11-44a3-a8ed-78296cb57cf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:30 np0005538960 NetworkManager[55548]: <info>  [1764347910.9875] device (tapaf1be1f9-00): carrier: link connected
Nov 28 11:38:30 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:30.991 214278 DEBUG oslo.privsep.daemon [-] privsep: reply[ef587545-b0e3-4998-af4e-80ac9459ccf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.009 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[ea001288-0305-4421-bf34-ec32e28f93cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf1be1f9-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:fd:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501056, 'reachable_time': 33173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225208, 'error': None, 'target': 'ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.027 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a026336b-2ec6-4e88-9d5a-08ab0cf10141]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe52:fd52'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 501056, 'tstamp': 501056}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225209, 'error': None, 'target': 'ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.044 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[7c437ec0-7fc5-4082-bd21-12998b30022f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf1be1f9-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:52:fd:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 73], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501056, 'reachable_time': 33173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225210, 'error': None, 'target': 'ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.077 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[275088cc-2b84-4aaf-9cf0-aa2e03062ce7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.096 187256 DEBUG nova.compute.manager [req-3827d46b-e594-4451-a68e-5c1070793432 req-ca4271a9-b7eb-4d66-a4e5-759fc8eba538 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received event network-vif-plugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.096 187256 DEBUG oslo_concurrency.lockutils [req-3827d46b-e594-4451-a68e-5c1070793432 req-ca4271a9-b7eb-4d66-a4e5-759fc8eba538 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.097 187256 DEBUG oslo_concurrency.lockutils [req-3827d46b-e594-4451-a68e-5c1070793432 req-ca4271a9-b7eb-4d66-a4e5-759fc8eba538 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.097 187256 DEBUG oslo_concurrency.lockutils [req-3827d46b-e594-4451-a68e-5c1070793432 req-ca4271a9-b7eb-4d66-a4e5-759fc8eba538 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.097 187256 DEBUG nova.compute.manager [req-3827d46b-e594-4451-a68e-5c1070793432 req-ca4271a9-b7eb-4d66-a4e5-759fc8eba538 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Processing event network-vif-plugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.138 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[03a323ee-c354-4ad0-ad91-16cda446cf13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.141 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf1be1f9-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.141 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.142 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf1be1f9-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:38:31 np0005538960 kernel: tapaf1be1f9-00: entered promiscuous mode
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.144 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:31 np0005538960 NetworkManager[55548]: <info>  [1764347911.1457] manager: (tapaf1be1f9-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/128)
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.147 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaf1be1f9-00, col_values=(('external_ids', {'iface-id': 'ad292d62-0a33-4bcb-bba2-e947dea85e36'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.148 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:31 np0005538960 ovn_controller[95460]: 2025-11-28T16:38:31Z|00242|binding|INFO|Releasing lport ad292d62-0a33-4bcb-bba2-e947dea85e36 from this chassis (sb_readonly=0)
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.161 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.162 104369 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/af1be1f9-0fe9-4225-a7a3-625e96eb67d3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/af1be1f9-0fe9-4225-a7a3-625e96eb67d3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.163 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[67548fe9-7a4f-49d4-8ccb-ab5a9404622f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.164 104369 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: global
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    log         /dev/log local0 debug
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    log-tag     haproxy-metadata-proxy-af1be1f9-0fe9-4225-a7a3-625e96eb67d3
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    user        root
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    group       root
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    maxconn     1024
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    pidfile     /var/lib/neutron/external/pids/af1be1f9-0fe9-4225-a7a3-625e96eb67d3.pid.haproxy
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    daemon
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: defaults
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    log global
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    mode http
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    option httplog
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    option dontlognull
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    option http-server-close
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    option forwardfor
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    retries                 3
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    timeout http-request    30s
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    timeout connect         30s
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    timeout client          32s
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    timeout server          32s
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    timeout http-keep-alive 30s
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: listen listener
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    bind 169.254.169.254:80
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    server metadata /var/lib/neutron/metadata_proxy
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]:    http-request add-header X-OVN-Network-ID af1be1f9-0fe9-4225-a7a3-625e96eb67d3
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Nov 28 11:38:31 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:31.165 104369 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'env', 'PROCESS_TAG=haproxy-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/af1be1f9-0fe9-4225-a7a3-625e96eb67d3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:38:31 np0005538960 podman[225240]: 2025-11-28 16:38:31.563108887 +0000 UTC m=+0.063740054 container create 90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 11:38:31 np0005538960 systemd[1]: Started libpod-conmon-90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2.scope.
Nov 28 11:38:31 np0005538960 podman[225240]: 2025-11-28 16:38:31.528703998 +0000 UTC m=+0.029335195 image pull c64a92d8e8fa4f5fb5baf11a4a693a964be3868fb7e72462c6e612c604f8d071 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Nov 28 11:38:31 np0005538960 systemd[1]: Started libcrun container.
Nov 28 11:38:31 np0005538960 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22d4116022170ee938921a77ff47961e9e29a5772c9e793614d56244b9f1c46b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Nov 28 11:38:31 np0005538960 podman[225240]: 2025-11-28 16:38:31.667843428 +0000 UTC m=+0.168474615 container init 90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:38:31 np0005538960 podman[225240]: 2025-11-28 16:38:31.674270514 +0000 UTC m=+0.174901681 container start 90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0)
Nov 28 11:38:31 np0005538960 neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3[225255]: [NOTICE]   (225259) : New worker (225261) forked
Nov 28 11:38:31 np0005538960 neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3[225255]: [NOTICE]   (225259) : Loading success.
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.974 187256 DEBUG nova.network.neutron [req-f9d5bfbd-77b2-47b9-9b72-9342a6258a54 req-8ebd7f08-3714-4806-ad83-44c034ca70d2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Updated VIF entry in instance network info cache for port 56cebb30-3170-4bdb-bf92-e59e3d4a20ea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.975 187256 DEBUG nova.network.neutron [req-f9d5bfbd-77b2-47b9-9b72-9342a6258a54 req-8ebd7f08-3714-4806-ad83-44c034ca70d2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Updating instance_info_cache with network_info: [{"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:38:31 np0005538960 nova_compute[187252]: 2025-11-28 16:38:31.993 187256 DEBUG oslo_concurrency.lockutils [req-f9d5bfbd-77b2-47b9-9b72-9342a6258a54 req-8ebd7f08-3714-4806-ad83-44c034ca70d2 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.019 187256 DEBUG nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.020 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347912.0199294, 95f49283-16da-4af6-b1a2-acba779ab5e4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.020 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] VM Started (Lifecycle Event)#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.023 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.029 187256 INFO nova.virt.libvirt.driver [-] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Instance spawned successfully.#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.030 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.043 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.052 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.057 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.058 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.059 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.059 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.060 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.060 187256 DEBUG nova.virt.libvirt.driver [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.071 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.072 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347912.0200157, 95f49283-16da-4af6-b1a2-acba779ab5e4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.072 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] VM Paused (Lifecycle Event)#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.103 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.109 187256 DEBUG nova.virt.driver [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] Emitting event <LifecycleEvent: 1764347912.0229275, 95f49283-16da-4af6-b1a2-acba779ab5e4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.110 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] VM Resumed (Lifecycle Event)#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.125 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.130 187256 INFO nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Took 5.68 seconds to spawn the instance on the hypervisor.#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.130 187256 DEBUG nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.133 187256 DEBUG nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.165 187256 INFO nova.compute.manager [None req-c419dea5-57db-4ea0-8e4f-a11d0111538c - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.208 187256 INFO nova.compute.manager [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Took 6.20 seconds to build instance.#033[00m
Nov 28 11:38:32 np0005538960 nova_compute[187252]: 2025-11-28 16:38:32.223 187256 DEBUG oslo_concurrency.lockutils [None req-3baa9cc3-6ffd-4a30-9f2b-45a6bfbeae40 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:33 np0005538960 nova_compute[187252]: 2025-11-28 16:38:33.238 187256 DEBUG nova.compute.manager [req-736d6c9f-3acc-4084-8a4e-720cad75654f req-f4c526df-2198-4b98-abf9-d5fde3725174 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received event network-vif-plugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:38:33 np0005538960 nova_compute[187252]: 2025-11-28 16:38:33.239 187256 DEBUG oslo_concurrency.lockutils [req-736d6c9f-3acc-4084-8a4e-720cad75654f req-f4c526df-2198-4b98-abf9-d5fde3725174 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:33 np0005538960 nova_compute[187252]: 2025-11-28 16:38:33.239 187256 DEBUG oslo_concurrency.lockutils [req-736d6c9f-3acc-4084-8a4e-720cad75654f req-f4c526df-2198-4b98-abf9-d5fde3725174 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:33 np0005538960 nova_compute[187252]: 2025-11-28 16:38:33.239 187256 DEBUG oslo_concurrency.lockutils [req-736d6c9f-3acc-4084-8a4e-720cad75654f req-f4c526df-2198-4b98-abf9-d5fde3725174 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:33 np0005538960 nova_compute[187252]: 2025-11-28 16:38:33.240 187256 DEBUG nova.compute.manager [req-736d6c9f-3acc-4084-8a4e-720cad75654f req-f4c526df-2198-4b98-abf9-d5fde3725174 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] No waiting events found dispatching network-vif-plugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:38:33 np0005538960 nova_compute[187252]: 2025-11-28 16:38:33.240 187256 WARNING nova.compute.manager [req-736d6c9f-3acc-4084-8a4e-720cad75654f req-f4c526df-2198-4b98-abf9-d5fde3725174 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received unexpected event network-vif-plugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea for instance with vm_state active and task_state None.#033[00m
Nov 28 11:38:33 np0005538960 nova_compute[187252]: 2025-11-28 16:38:33.850 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:34 np0005538960 nova_compute[187252]: 2025-11-28 16:38:34.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.072 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.320 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'name': 'tempest-TestGettingAddress-server-1667107488', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000035', 'OS-EXT-SRV-ATTR:host': 'compute-1.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'b5f802fe6e0b4d62bba6143515207a40', 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'hostId': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.337 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.339 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.340 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.340 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.340 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b063706-2438-4a44-9929-eadbcecf40fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-vda', 'timestamp': '2025-11-28T16:38:35.321585', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af0b6940-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.955747704, 'message_signature': 'd5dcc406874edeb59a22eb477c55f558246ebc0933739b71dd1a34c56b1637e7'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-sda', 'timestamp': '2025-11-28T16:38:35.321585', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af0b7ac0-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.955747704, 'message_signature': '88ae5930f44a9274bd13b1dbfb2448032eb2d5e079330f5302918cfad135b01b'}]}, 'timestamp': '2025-11-28 16:38:35.339359', '_unique_id': 'ee99a7891f174230b3f439e97d94c1fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.340 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.342 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.342 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.342 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd652e848-28ac-4653-a845-de44e2ed1ac8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-vda', 'timestamp': '2025-11-28T16:38:35.342552', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af0c04d6-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.955747704, 'message_signature': 'c8de377011e27c9083935d35c092c6fce3ac46f7caacfa9803246d58ec70e85b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-sda', 'timestamp': '2025-11-28T16:38:35.342552', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af0c12f0-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.955747704, 'message_signature': 'afe58db7c0b7a331a98f3b6c5028dbf5efcdecf12de58f14e9946892265a91fa'}]}, 'timestamp': '2025-11-28 16:38:35.343264', '_unique_id': '0f668ccec3a241d3a25e3926f66d03e3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.344 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.345 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.345 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.346 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1667107488>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1667107488>]
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.346 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.392 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.read.latency volume: 137504566 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.393 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.read.latency volume: 492573 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9612b76-6e06-471c-9db6-57c03d150b5b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 137504566, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-vda', 'timestamp': '2025-11-28T16:38:35.346417', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af13b2c6-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': '27793d82f033f5bd474ba0857843dcf7f4c5c980f3144820555f44033dade70c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 492573, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-sda', 'timestamp': '2025-11-28T16:38:35.346417', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af13bf8c-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': '1bb928a990005d5e3fca4b675983b4e91124775a16ca8a899d7ccdfeea3f3bc7'}]}, 'timestamp': '2025-11-28 16:38:35.393603', '_unique_id': '2c6a0f702e1445bba522afbda0626edf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.394 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.395 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.398 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 95f49283-16da-4af6-b1a2-acba779ab5e4 / tap56cebb30-31 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.398 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '300cb488-91cd-4bd7-a8f1-01397dd7e832', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000035-95f49283-16da-4af6-b1a2-acba779ab5e4-tap56cebb30-31', 'timestamp': '2025-11-28T16:38:35.395616', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'tap56cebb30-31', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:5f:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56cebb30-31'}, 'message_id': 'af149466-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.029801168, 'message_signature': 'ff9c0a22c2be3fac69c2a472d9906a54f3e3e4e66d1cb96285bd87c2d6fc3bd0'}]}, 'timestamp': '2025-11-28 16:38:35.399018', '_unique_id': '62723578533d4c1a9e6fdf0c4a82cd32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.399 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.400 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.400 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.400 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1667107488>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1667107488>]
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.400 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.400 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.401 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b8d198a-cd66-40da-8641-a3aca2fc7c01', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-vda', 'timestamp': '2025-11-28T16:38:35.400960', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af14ec7c-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': '47cfcd9697a90d8baa2be9bd9b18d88b6e8456d49c6c4a35f269b746fec568e2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-sda', 'timestamp': '2025-11-28T16:38:35.400960', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af14f492-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': 'e19c8752fdcf55937668807376835172c966fab6ae4c1b9619ae2ea565b2644e'}]}, 'timestamp': '2025-11-28 16:38:35.401434', '_unique_id': '7530274c34d0472d9bbf4879416b0687'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.402 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'efdb7a00-e960-41b3-91aa-771c9319e74b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000035-95f49283-16da-4af6-b1a2-acba779ab5e4-tap56cebb30-31', 'timestamp': '2025-11-28T16:38:35.402797', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'tap56cebb30-31', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:5f:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56cebb30-31'}, 'message_id': 'af1533da-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.029801168, 'message_signature': '3bd79cd1b71f071c8b99405d6fcd27a5ba029be2b92620d365a4a4000e54e2f1'}]}, 'timestamp': '2025-11-28 16:38:35.403082', '_unique_id': '7a89b0a0d8b7461181053b6737c61bff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.403 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.404 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.404 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74466288-4194-4d51-a67d-124a8f35fd3c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000035-95f49283-16da-4af6-b1a2-acba779ab5e4-tap56cebb30-31', 'timestamp': '2025-11-28T16:38:35.404259', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'tap56cebb30-31', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:5f:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56cebb30-31'}, 'message_id': 'af156cc4-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.029801168, 'message_signature': '8f9b5df59c48338fc635a48baa8185d516d6df229d5e9b6fa4396aa4527b2704'}]}, 'timestamp': '2025-11-28 16:38:35.404540', '_unique_id': 'aabf6588ce9945d6b98d999abd381e2a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.405 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ade4035f-10ec-483f-9c8b-b7eb401ed589', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000035-95f49283-16da-4af6-b1a2-acba779ab5e4-tap56cebb30-31', 'timestamp': '2025-11-28T16:38:35.405847', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'tap56cebb30-31', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:5f:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56cebb30-31'}, 'message_id': 'af15e582-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.029801168, 'message_signature': '1bb37c2268d0981fbe723ab8df68927b76d7d867bfe6c0b2bfe882434bf44e2b'}]}, 'timestamp': '2025-11-28 16:38:35.407654', '_unique_id': '5a571ddc730d4f0687d9c379df241397'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.408 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ab9a939f-744e-43ac-bb2c-52d515f0c928', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000035-95f49283-16da-4af6-b1a2-acba779ab5e4-tap56cebb30-31', 'timestamp': '2025-11-28T16:38:35.409020', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'tap56cebb30-31', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:5f:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56cebb30-31'}, 'message_id': 'af1626c8-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.029801168, 'message_signature': 'ac54aaae6dd9159815c1bca752abd71263b4cdcd904ce81ff6e917b259c6fc49'}]}, 'timestamp': '2025-11-28 16:38:35.409285', '_unique_id': '6ba517da85704f09999fc04083119929'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.409 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.410 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.410 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '232e4cc2-8880-478b-981f-188162799582', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000035-95f49283-16da-4af6-b1a2-acba779ab5e4-tap56cebb30-31', 'timestamp': '2025-11-28T16:38:35.410770', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'tap56cebb30-31', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:5f:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56cebb30-31'}, 'message_id': 'af1669e4-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.029801168, 'message_signature': '715ee4e6ebc0acb4ca3c6a6a6f1b981d1e9f7410479e7dcc7dd4952e6174f599'}]}, 'timestamp': '2025-11-28 16:38:35.411044', '_unique_id': '803fe5b19d824a7fb9c24f1c10f3ca9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.411 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.412 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.425 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.434 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.435 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 95f49283-16da-4af6-b1a2-acba779ab5e4: ceilometer.compute.pollsters.NoVolumeException
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.435 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.435 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6f6037b8-2f11-4451-84f1-874f345cd65b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000035-95f49283-16da-4af6-b1a2-acba779ab5e4-tap56cebb30-31', 'timestamp': '2025-11-28T16:38:35.435407', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'tap56cebb30-31', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:5f:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56cebb30-31'}, 'message_id': 'af1a2f7a-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.029801168, 'message_signature': 'c3fb9d615f74420beedf583b2868a734963823b4c86b7783587297a2b42e94ec'}]}, 'timestamp': '2025-11-28 16:38:35.435830', '_unique_id': '10ff6e8558c546dd9c214a7b9f5a11ed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.436 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.437 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.437 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af2759ad-0190-40fa-972f-49248c207f75', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000035-95f49283-16da-4af6-b1a2-acba779ab5e4-tap56cebb30-31', 'timestamp': '2025-11-28T16:38:35.437640', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'tap56cebb30-31', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:5f:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56cebb30-31'}, 'message_id': 'af1a84a2-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.029801168, 'message_signature': 'cdde0fd52f7f016c13833686a4f52a9f29ce7fe6296e50929012115398313a4d'}]}, 'timestamp': '2025-11-28 16:38:35.437909', '_unique_id': 'ad97132878494be193c8e80d890676ca'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.438 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.439 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.439 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.439 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '756dd02e-ba32-45de-b5c2-e78956717463', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-vda', 'timestamp': '2025-11-28T16:38:35.439627', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af1ad132-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': '920efc7545fee6ea739c628269d11b12f7ccf2df2973ce49520ae3a9c4eb3ea4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-sda', 'timestamp': '2025-11-28T16:38:35.439627', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af1ada60-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': '0c37b13415b022e0928fc425538863f2ea71c068ac0ccf900ec96fc806ef7b88'}]}, 'timestamp': '2025-11-28 16:38:35.440079', '_unique_id': 'e75a2ea2a6024b33af552f1bd9281201'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.440 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.441 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.441 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '102960a5-1be1-43e8-9555-67f7cd531eaa', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-vda', 'timestamp': '2025-11-28T16:38:35.441725', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af1b2420-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': 'ca77628ac33129c89d9746b70f2826a0df861b09c47b98ae504c8a732d4d1b58'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-sda', 'timestamp': '2025-11-28T16:38:35.441725', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af1b2d8a-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': '2c1665b7d9d3dd23c7d48410581179e6504467b680c3c1a6222a62a8d71e94f1'}]}, 'timestamp': '2025-11-28 16:38:35.442213', '_unique_id': '7533f1595b214e6f84ed3bfd30aec2cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.442 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.445 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.445 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.445 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7f7c72be-f42a-4154-b0bf-7754ce51f8a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-vda', 'timestamp': '2025-11-28T16:38:35.445326', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af1bb5ac-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': '5b9b3a5d8d6b747457de0c607ec4d962c87dad79d4d3a0196ea53ad440d4d22d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 
'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-sda', 'timestamp': '2025-11-28T16:38:35.445326', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af1bc2fe-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': '97c12e9a1bac67c4aace990d8ca80867eecf3ed3cff7728605630232dcff31f9'}]}, 'timestamp': '2025-11-28 16:38:35.446062', '_unique_id': '024389b8c21f430eb66f7ad489cb0ede'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.446 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.447 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.447 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2041580-29b6-4507-bb8e-08d21612af2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000035-95f49283-16da-4af6-b1a2-acba779ab5e4-tap56cebb30-31', 'timestamp': '2025-11-28T16:38:35.447677', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'tap56cebb30-31', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:5f:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56cebb30-31'}, 'message_id': 'af1c0eda-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.029801168, 'message_signature': '96f422e7cb1bb85b1340cb5228ffb325ea00852ed353251272d26ed595613b68'}]}, 'timestamp': '2025-11-28 16:38:35.448054', '_unique_id': '907bb5cf0549405a8013ce20616ce738'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.448 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.449 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.449 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.450 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1667107488>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1667107488>]
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.450 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.450 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.450 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1667107488>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1667107488>]
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.450 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.451 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.451 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '101c0c90-629e-4831-8960-fb6617b2585c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-vda', 'timestamp': '2025-11-28T16:38:35.451032', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af1c9148-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': 'e41e095d1ece928abc39f5284fd36f426a43159c4132841056637ad42357a3d3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-sda', 'timestamp': '2025-11-28T16:38:35.451032', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af1c9c60-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.98060928, 'message_signature': 'd83eddf51a68a30dc45b7dad08eb56b7cb0c22ce582a05dfc75736a2125e0a60'}]}, 'timestamp': '2025-11-28 16:38:35.451625', '_unique_id': '646b2b55c5004b6bb65235dbe33761fa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.452 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.453 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.453 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5c389059-b8bb-4981-9e0b-2fb3a508efd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 'instance-00000035-95f49283-16da-4af6-b1a2-acba779ab5e4-tap56cebb30-31', 'timestamp': '2025-11-28T16:38:35.453352', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'tap56cebb30-31', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:46:5f:7b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap56cebb30-31'}, 'message_id': 'af1ceaee-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.029801168, 'message_signature': '1fd38a3ac5c9f378078f6f2fc96dcc78e967d2514707a20d28973ef2c1e1fabe'}]}, 'timestamp': '2025-11-28 16:38:35.453626', '_unique_id': '1cfa4645c8534aa3bd1455ffc1bd4e96'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.454 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.455 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.455 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.455 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2425f1d2-beac-4d25-a88f-5af2e2ee6bb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4-vda', 'timestamp': '2025-11-28T16:38:35.455147', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af1d3152-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.955747704, 'message_signature': 'fd9015233671d71ef0468f0160b9c5902de5be7b908cc165b8f05bc6a395f901'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': 
'95f49283-16da-4af6-b1a2-acba779ab5e4-sda', 'timestamp': '2025-11-28T16:38:35.455147', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af1d3ab2-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5014.955747704, 'message_signature': '19c55dd02913654ab6425bdf65c40812fa6f475e222db06dce67794ec618743e'}]}, 'timestamp': '2025-11-28 16:38:35.455653', '_unique_id': 'd463e4ad057b4f1e986313a2308ef315'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.456 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.457 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.457 12 DEBUG ceilometer.compute.pollsters [-] 95f49283-16da-4af6-b1a2-acba779ab5e4/cpu volume: 3220000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Nov 28 11:38:35 np0005538960 podman[225277]: 2025-11-28 16:38:35.457662867 +0000 UTC m=+0.064095431 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1fe7cdc8-21de-46f9-92dc-95e0225ac016', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3220000000, 'user_id': '23b8e0c173df4c2883fccd8cb472e427', 'user_name': None, 'project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'project_name': None, 'resource_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'timestamp': '2025-11-28T16:38:35.457678', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1667107488', 'name': 'instance-00000035', 'instance_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'instance_type': 'm1.nano', 'host': '73b7d09f881f78131e4bf808f80ef30151b3a0714cc039c2aa418af9', 'instance_host': 'compute-1.ctlplane.example.com', 'flavor': {'id': 'c90217bd-1e89-4c68-8e01-33bf1cee456c', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '48a87826-de14-4dde-9157-9baf2160cd7d'}, 'image_ref': '48a87826-de14-4dde-9157-9baf2160cd7d', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'af1d9908-cc78-11f0-bcca-fa163efe7585', 'monotonic_time': 5015.06887572, 'message_signature': '0d0406ad3514f3c688dd708f00730698a0bd56ebaf0100a3199059c3e1a69ce6'}]}, 'timestamp': '2025-11-28 16:38:35.458194', '_unique_id': '6dcfa92000104915bb4146c63cd22df4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     yield
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Nov 28 11:38:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:38:35.459 12 ERROR oslo_messaging.notify.messaging 
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.486 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.487 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.553 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.729 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.734 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5556MB free_disk=73.33644104003906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.735 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.735 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.824 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance 95f49283-16da-4af6-b1a2-acba779ab5e4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.825 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.825 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.869 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.884 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.904 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:38:35 np0005538960 nova_compute[187252]: 2025-11-28 16:38:35.904 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:38:37 np0005538960 podman[225308]: 2025-11-28 16:38:37.151130821 +0000 UTC m=+0.055675388 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.openshift.expose-services=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:38:38 np0005538960 nova_compute[187252]: 2025-11-28 16:38:38.858 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:38 np0005538960 nova_compute[187252]: 2025-11-28 16:38:38.905 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:38:38 np0005538960 nova_compute[187252]: 2025-11-28 16:38:38.905 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:38:40 np0005538960 nova_compute[187252]: 2025-11-28 16:38:40.076 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:43 np0005538960 nova_compute[187252]: 2025-11-28 16:38:43.862 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:44 np0005538960 ovn_controller[95460]: 2025-11-28T16:38:44Z|00243|binding|INFO|Releasing lport ad292d62-0a33-4bcb-bba2-e947dea85e36 from this chassis (sb_readonly=0)
Nov 28 11:38:44 np0005538960 nova_compute[187252]: 2025-11-28 16:38:44.241 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:44 np0005538960 NetworkManager[55548]: <info>  [1764347924.2425] manager: (patch-provnet-64793cf2-ef01-4631-9a1a-81334065efe2-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Nov 28 11:38:44 np0005538960 NetworkManager[55548]: <info>  [1764347924.2439] manager: (patch-br-int-to-provnet-64793cf2-ef01-4631-9a1a-81334065efe2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Nov 28 11:38:44 np0005538960 ovn_controller[95460]: 2025-11-28T16:38:44Z|00244|binding|INFO|Releasing lport ad292d62-0a33-4bcb-bba2-e947dea85e36 from this chassis (sb_readonly=0)
Nov 28 11:38:44 np0005538960 nova_compute[187252]: 2025-11-28 16:38:44.271 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:44 np0005538960 nova_compute[187252]: 2025-11-28 16:38:44.279 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:45 np0005538960 nova_compute[187252]: 2025-11-28 16:38:45.122 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:45 np0005538960 ovn_controller[95460]: 2025-11-28T16:38:45Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:46:5f:7b 10.100.0.11
Nov 28 11:38:45 np0005538960 ovn_controller[95460]: 2025-11-28T16:38:45Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:46:5f:7b 10.100.0.11
Nov 28 11:38:46 np0005538960 nova_compute[187252]: 2025-11-28 16:38:46.439 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:46.439 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:38:46 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:46.441 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:38:46 np0005538960 podman[225357]: 2025-11-28 16:38:46.701840756 +0000 UTC m=+0.060765531 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:38:46 np0005538960 nova_compute[187252]: 2025-11-28 16:38:46.755 187256 DEBUG nova.compute.manager [req-59ceae70-881a-4571-90fc-29c2d7e6c3ba req-3bdeb661-c2fd-4f35-aea9-835995cb8381 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received event network-changed-56cebb30-3170-4bdb-bf92-e59e3d4a20ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:38:46 np0005538960 nova_compute[187252]: 2025-11-28 16:38:46.756 187256 DEBUG nova.compute.manager [req-59ceae70-881a-4571-90fc-29c2d7e6c3ba req-3bdeb661-c2fd-4f35-aea9-835995cb8381 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Refreshing instance network info cache due to event network-changed-56cebb30-3170-4bdb-bf92-e59e3d4a20ea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:38:46 np0005538960 nova_compute[187252]: 2025-11-28 16:38:46.757 187256 DEBUG oslo_concurrency.lockutils [req-59ceae70-881a-4571-90fc-29c2d7e6c3ba req-3bdeb661-c2fd-4f35-aea9-835995cb8381 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:38:46 np0005538960 nova_compute[187252]: 2025-11-28 16:38:46.757 187256 DEBUG oslo_concurrency.lockutils [req-59ceae70-881a-4571-90fc-29c2d7e6c3ba req-3bdeb661-c2fd-4f35-aea9-835995cb8381 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:38:46 np0005538960 nova_compute[187252]: 2025-11-28 16:38:46.757 187256 DEBUG nova.network.neutron [req-59ceae70-881a-4571-90fc-29c2d7e6c3ba req-3bdeb661-c2fd-4f35-aea9-835995cb8381 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Refreshing network info cache for port 56cebb30-3170-4bdb-bf92-e59e3d4a20ea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:38:48 np0005538960 nova_compute[187252]: 2025-11-28 16:38:48.864 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:50 np0005538960 nova_compute[187252]: 2025-11-28 16:38:50.126 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:52 np0005538960 podman[225378]: 2025-11-28 16:38:52.173897274 +0000 UTC m=+0.080465951 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:38:52 np0005538960 nova_compute[187252]: 2025-11-28 16:38:52.625 187256 DEBUG nova.network.neutron [req-59ceae70-881a-4571-90fc-29c2d7e6c3ba req-3bdeb661-c2fd-4f35-aea9-835995cb8381 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Updated VIF entry in instance network info cache for port 56cebb30-3170-4bdb-bf92-e59e3d4a20ea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:38:52 np0005538960 nova_compute[187252]: 2025-11-28 16:38:52.627 187256 DEBUG nova.network.neutron [req-59ceae70-881a-4571-90fc-29c2d7e6c3ba req-3bdeb661-c2fd-4f35-aea9-835995cb8381 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Updating instance_info_cache with network_info: [{"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", 
"qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:38:53 np0005538960 nova_compute[187252]: 2025-11-28 16:38:53.371 187256 DEBUG oslo_concurrency.lockutils [req-59ceae70-881a-4571-90fc-29c2d7e6c3ba req-3bdeb661-c2fd-4f35-aea9-835995cb8381 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:38:53 np0005538960 nova_compute[187252]: 2025-11-28 16:38:53.915 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:38:54.444 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:38:55 np0005538960 nova_compute[187252]: 2025-11-28 16:38:55.136 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:38:58 np0005538960 podman[225402]: 2025-11-28 16:38:58.182199188 +0000 UTC m=+0.087914012 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 11:38:58 np0005538960 nova_compute[187252]: 2025-11-28 16:38:58.918 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:00 np0005538960 nova_compute[187252]: 2025-11-28 16:39:00.168 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:01 np0005538960 podman[225431]: 2025-11-28 16:39:01.157153127 +0000 UTC m=+0.053514555 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 11:39:01 np0005538960 podman[225430]: 2025-11-28 16:39:01.164864175 +0000 UTC m=+0.062144514 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:39:03 np0005538960 nova_compute[187252]: 2025-11-28 16:39:03.920 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:05 np0005538960 nova_compute[187252]: 2025-11-28 16:39:05.172 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:06 np0005538960 podman[225466]: 2025-11-28 16:39:06.143808972 +0000 UTC m=+0.050980202 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:39:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:06.362 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:39:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:06.363 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:39:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:06.364 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:39:08 np0005538960 podman[225490]: 2025-11-28 16:39:08.151080171 +0000 UTC m=+0.060099366 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.6, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 11:39:08 np0005538960 nova_compute[187252]: 2025-11-28 16:39:08.923 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:10 np0005538960 nova_compute[187252]: 2025-11-28 16:39:10.176 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:13 np0005538960 nova_compute[187252]: 2025-11-28 16:39:13.926 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:15 np0005538960 nova_compute[187252]: 2025-11-28 16:39:15.179 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:17 np0005538960 podman[225512]: 2025-11-28 16:39:17.16414958 +0000 UTC m=+0.068491370 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:39:18 np0005538960 nova_compute[187252]: 2025-11-28 16:39:18.929 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:20 np0005538960 nova_compute[187252]: 2025-11-28 16:39:20.183 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:23 np0005538960 podman[225536]: 2025-11-28 16:39:23.153401487 +0000 UTC m=+0.058210648 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:39:23 np0005538960 nova_compute[187252]: 2025-11-28 16:39:23.931 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:25 np0005538960 nova_compute[187252]: 2025-11-28 16:39:25.206 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:28 np0005538960 nova_compute[187252]: 2025-11-28 16:39:28.933 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:29 np0005538960 podman[225561]: 2025-11-28 16:39:29.219388474 +0000 UTC m=+0.118852266 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125)
Nov 28 11:39:30 np0005538960 nova_compute[187252]: 2025-11-28 16:39:30.209 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:30 np0005538960 nova_compute[187252]: 2025-11-28 16:39:30.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:39:30 np0005538960 nova_compute[187252]: 2025-11-28 16:39:30.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:39:30 np0005538960 nova_compute[187252]: 2025-11-28 16:39:30.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:39:30 np0005538960 nova_compute[187252]: 2025-11-28 16:39:30.740 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:39:30 np0005538960 nova_compute[187252]: 2025-11-28 16:39:30.741 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquired lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:39:30 np0005538960 nova_compute[187252]: 2025-11-28 16:39:30.741 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Nov 28 11:39:30 np0005538960 nova_compute[187252]: 2025-11-28 16:39:30.741 187256 DEBUG nova.objects.instance [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 95f49283-16da-4af6-b1a2-acba779ab5e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:39:32 np0005538960 podman[225588]: 2025-11-28 16:39:32.158434469 +0000 UTC m=+0.063708373 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 11:39:32 np0005538960 podman[225587]: 2025-11-28 16:39:32.160147651 +0000 UTC m=+0.068612383 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, org.label-schema.build-date=20251125, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:39:32 np0005538960 nova_compute[187252]: 2025-11-28 16:39:32.473 187256 DEBUG nova.network.neutron [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Updating instance_info_cache with network_info: [{"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:39:32 np0005538960 nova_compute[187252]: 2025-11-28 16:39:32.487 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Releasing lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:39:32 np0005538960 nova_compute[187252]: 2025-11-28 16:39:32.487 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Nov 28 11:39:32 np0005538960 nova_compute[187252]: 2025-11-28 16:39:32.487 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:39:32 np0005538960 nova_compute[187252]: 2025-11-28 16:39:32.488 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:39:32 np0005538960 nova_compute[187252]: 2025-11-28 16:39:32.488 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:39:32 np0005538960 nova_compute[187252]: 2025-11-28 16:39:32.488 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:39:32 np0005538960 nova_compute[187252]: 2025-11-28 16:39:32.488 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:39:33 np0005538960 nova_compute[187252]: 2025-11-28 16:39:33.936 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.211 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.339 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.407 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.473 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.474 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.537 187256 DEBUG oslo_concurrency.processutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.704 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.706 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5600MB free_disk=73.30867385864258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.707 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.707 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.800 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Instance 95f49283-16da-4af6-b1a2-acba779ab5e4 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.800 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.800 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.849 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.888 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.891 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:39:35 np0005538960 nova_compute[187252]: 2025-11-28 16:39:35.891 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:39:36 np0005538960 nova_compute[187252]: 2025-11-28 16:39:36.887 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:39:37 np0005538960 podman[225630]: 2025-11-28 16:39:37.138081751 +0000 UTC m=+0.048100103 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:39:38 np0005538960 nova_compute[187252]: 2025-11-28 16:39:38.939 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:39 np0005538960 podman[225654]: 2025-11-28 16:39:39.156264884 +0000 UTC m=+0.065994398 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Nov 28 11:39:39 np0005538960 nova_compute[187252]: 2025-11-28 16:39:39.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:39:39 np0005538960 nova_compute[187252]: 2025-11-28 16:39:39.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:39:40 np0005538960 nova_compute[187252]: 2025-11-28 16:39:40.217 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:43 np0005538960 nova_compute[187252]: 2025-11-28 16:39:43.944 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:45 np0005538960 nova_compute[187252]: 2025-11-28 16:39:45.243 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:48 np0005538960 podman[225678]: 2025-11-28 16:39:48.187480135 +0000 UTC m=+0.095875176 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2)
Nov 28 11:39:48 np0005538960 nova_compute[187252]: 2025-11-28 16:39:48.945 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:49 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:49.951 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:39:49 np0005538960 nova_compute[187252]: 2025-11-28 16:39:49.952 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:49 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:49.953 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:39:50 np0005538960 nova_compute[187252]: 2025-11-28 16:39:50.246 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:53 np0005538960 nova_compute[187252]: 2025-11-28 16:39:53.947 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:54 np0005538960 podman[225699]: 2025-11-28 16:39:54.154167863 +0000 UTC m=+0.057737488 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:39:54 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:54.956 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:39:55 np0005538960 nova_compute[187252]: 2025-11-28 16:39:55.250 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.102 187256 DEBUG nova.compute.manager [req-a1b42bfb-9d26-4634-82e1-9e9be8341bf3 req-e0299907-2a44-4d07-87f0-360bc41d573f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received event network-changed-56cebb30-3170-4bdb-bf92-e59e3d4a20ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.102 187256 DEBUG nova.compute.manager [req-a1b42bfb-9d26-4634-82e1-9e9be8341bf3 req-e0299907-2a44-4d07-87f0-360bc41d573f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Refreshing instance network info cache due to event network-changed-56cebb30-3170-4bdb-bf92-e59e3d4a20ea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.102 187256 DEBUG oslo_concurrency.lockutils [req-a1b42bfb-9d26-4634-82e1-9e9be8341bf3 req-e0299907-2a44-4d07-87f0-360bc41d573f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.102 187256 DEBUG oslo_concurrency.lockutils [req-a1b42bfb-9d26-4634-82e1-9e9be8341bf3 req-e0299907-2a44-4d07-87f0-360bc41d573f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquired lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.103 187256 DEBUG nova.network.neutron [req-a1b42bfb-9d26-4634-82e1-9e9be8341bf3 req-e0299907-2a44-4d07-87f0-360bc41d573f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Refreshing network info cache for port 56cebb30-3170-4bdb-bf92-e59e3d4a20ea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.644 187256 DEBUG oslo_concurrency.lockutils [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "95f49283-16da-4af6-b1a2-acba779ab5e4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.645 187256 DEBUG oslo_concurrency.lockutils [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.645 187256 DEBUG oslo_concurrency.lockutils [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.646 187256 DEBUG oslo_concurrency.lockutils [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.646 187256 DEBUG oslo_concurrency.lockutils [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.648 187256 INFO nova.compute.manager [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Terminating instance#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.649 187256 DEBUG nova.compute.manager [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Nov 28 11:39:56 np0005538960 kernel: tap56cebb30-31 (unregistering): left promiscuous mode
Nov 28 11:39:56 np0005538960 NetworkManager[55548]: <info>  [1764347996.6707] device (tap56cebb30-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Nov 28 11:39:56 np0005538960 ovn_controller[95460]: 2025-11-28T16:39:56Z|00245|binding|INFO|Releasing lport 56cebb30-3170-4bdb-bf92-e59e3d4a20ea from this chassis (sb_readonly=0)
Nov 28 11:39:56 np0005538960 ovn_controller[95460]: 2025-11-28T16:39:56Z|00246|binding|INFO|Setting lport 56cebb30-3170-4bdb-bf92-e59e3d4a20ea down in Southbound
Nov 28 11:39:56 np0005538960 ovn_controller[95460]: 2025-11-28T16:39:56Z|00247|binding|INFO|Removing iface tap56cebb30-31 ovn-installed in OVS
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.683 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.698 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:56 np0005538960 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000035.scope: Deactivated successfully.
Nov 28 11:39:56 np0005538960 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000035.scope: Consumed 17.799s CPU time.
Nov 28 11:39:56 np0005538960 systemd-machined[153518]: Machine qemu-19-instance-00000035 terminated.
Nov 28 11:39:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:56.761 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:5f:7b 10.100.0.11 2001:db8:0:1:f816:3eff:fe46:5f7b 2001:db8::f816:3eff:fe46:5f7b'], port_security=['fa:16:3e:46:5f:7b 10.100.0.11 2001:db8:0:1:f816:3eff:fe46:5f7b 2001:db8::f816:3eff:fe46:5f7b'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28 2001:db8:0:1:f816:3eff:fe46:5f7b/64 2001:db8::f816:3eff:fe46:5f7b/64', 'neutron:device_id': '95f49283-16da-4af6-b1a2-acba779ab5e4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '4', 'neutron:security_group_ids': '30d32d94-06c7-4d64-991b-715c9f46c5ae', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=17822ff8-9a5b-4363-9dcc-def524fd1d85, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>], logical_port=56cebb30-3170-4bdb-bf92-e59e3d4a20ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f087f1f7940>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:39:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:56.762 104369 INFO neutron.agent.ovn.metadata.agent [-] Port 56cebb30-3170-4bdb-bf92-e59e3d4a20ea in datapath af1be1f9-0fe9-4225-a7a3-625e96eb67d3 unbound from our chassis#033[00m
Nov 28 11:39:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:56.763 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network af1be1f9-0fe9-4225-a7a3-625e96eb67d3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:39:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:56.765 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[780c3d86-4980-4dc4-9171-a6fff603fca9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:39:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:56.766 104369 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3 namespace which is not needed anymore#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.875 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.880 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:56 np0005538960 neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3[225255]: [NOTICE]   (225259) : haproxy version is 2.8.14-c23fe91
Nov 28 11:39:56 np0005538960 neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3[225255]: [NOTICE]   (225259) : path to executable is /usr/sbin/haproxy
Nov 28 11:39:56 np0005538960 neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3[225255]: [WARNING]  (225259) : Exiting Master process...
Nov 28 11:39:56 np0005538960 neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3[225255]: [ALERT]    (225259) : Current worker (225261) exited with code 143 (Terminated)
Nov 28 11:39:56 np0005538960 neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3[225255]: [WARNING]  (225259) : All workers exited. Exiting... (0)
Nov 28 11:39:56 np0005538960 systemd[1]: libpod-90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2.scope: Deactivated successfully.
Nov 28 11:39:56 np0005538960 podman[225750]: 2025-11-28 16:39:56.909889342 +0000 UTC m=+0.053282629 container died 90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.920 187256 INFO nova.virt.libvirt.driver [-] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Instance destroyed successfully.#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.920 187256 DEBUG nova.objects.instance [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lazy-loading 'resources' on Instance uuid 95f49283-16da-4af6-b1a2-acba779ab5e4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Nov 28 11:39:56 np0005538960 systemd[1]: var-lib-containers-storage-overlay-22d4116022170ee938921a77ff47961e9e29a5772c9e793614d56244b9f1c46b-merged.mount: Deactivated successfully.
Nov 28 11:39:56 np0005538960 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2-userdata-shm.mount: Deactivated successfully.
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.947 187256 DEBUG nova.virt.libvirt.vif [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-11-28T16:38:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1667107488',display_name='tempest-TestGettingAddress-server-1667107488',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1667107488',id=53,image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOXqe3Ilt1rqiRPLA/CKrfLDll5E0YCx6n+wUQEHt+Ya18tmZcz7kF+TnTzGspdxJJcrmGDhBmx5/KTuXA2lK6EhBo1O/JZqueRg/YORf1eKURqrPFu///5Uh18nq5Mg+A==',key_name='tempest-TestGettingAddress-1176227644',keypairs=<?>,launch_index=0,launched_at=2025-11-28T16:38:32Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b5f802fe6e0b4d62bba6143515207a40',ramdisk_id='',reservation_id='r-lwaplynr',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='48a87826-de14-4dde-9157-9baf2160cd7d',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-2054466537',owner_user_name='tempest-TestGettingAddress-2054466537-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-11-28T16:38:32Z,user_data=None,user_id='23b8e0c173df4c2883fccd8cb472e427',uuid=95f49283-16da-4af6-b1a2-acba779ab5e4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.947 187256 DEBUG nova.network.os_vif_util [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converting VIF {"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.949 187256 DEBUG nova.network.os_vif_util [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:5f:7b,bridge_name='br-int',has_traffic_filtering=True,id=56cebb30-3170-4bdb-bf92-e59e3d4a20ea,network=Network(af1be1f9-0fe9-4225-a7a3-625e96eb67d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56cebb30-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.949 187256 DEBUG os_vif [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:5f:7b,bridge_name='br-int',has_traffic_filtering=True,id=56cebb30-3170-4bdb-bf92-e59e3d4a20ea,network=Network(af1be1f9-0fe9-4225-a7a3-625e96eb67d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56cebb30-31') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.951 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.952 187256 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56cebb30-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.954 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.957 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Nov 28 11:39:56 np0005538960 podman[225750]: 2025-11-28 16:39:56.961056798 +0000 UTC m=+0.104450085 container cleanup 90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.961 187256 INFO os_vif [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:5f:7b,bridge_name='br-int',has_traffic_filtering=True,id=56cebb30-3170-4bdb-bf92-e59e3d4a20ea,network=Network(af1be1f9-0fe9-4225-a7a3-625e96eb67d3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56cebb30-31')#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.962 187256 INFO nova.virt.libvirt.driver [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Deleting instance files /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4_del#033[00m
Nov 28 11:39:56 np0005538960 nova_compute[187252]: 2025-11-28 16:39:56.963 187256 INFO nova.virt.libvirt.driver [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Deletion of /var/lib/nova/instances/95f49283-16da-4af6-b1a2-acba779ab5e4_del complete#033[00m
Nov 28 11:39:56 np0005538960 systemd[1]: libpod-conmon-90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2.scope: Deactivated successfully.
Nov 28 11:39:57 np0005538960 podman[225793]: 2025-11-28 16:39:57.039662592 +0000 UTC m=+0.050848149 container remove 90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125)
Nov 28 11:39:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:57.045 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[68921b72-fa17-4cbd-8eb8-f68160598c88]: (4, ('Fri Nov 28 04:39:56 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3 (90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2)\n90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2\nFri Nov 28 04:39:56 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3 (90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2)\n90c10dfa8042ba3abaee45dbee02cb0d7eeeea302a1d762004aca31b1bd7f3d2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:39:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:57.047 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[09150ff0-baf8-4fe8-ae19-a354eaa3b298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:39:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:57.048 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf1be1f9-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:39:57 np0005538960 nova_compute[187252]: 2025-11-28 16:39:57.050 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:57 np0005538960 kernel: tapaf1be1f9-00: left promiscuous mode
Nov 28 11:39:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:57.055 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[241bcc33-b0a4-4892-9d5d-afeddfb5283b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:39:57 np0005538960 nova_compute[187252]: 2025-11-28 16:39:57.064 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:39:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:57.074 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[8a7ab2bb-c2ad-43f9-b214-634fa5180963]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:39:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:57.075 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[a5bd3074-b905-45fd-811a-e73590d02001]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:39:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:57.094 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[31a4f788-3476-41f5-88f5-083e3228c9a8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 501049, 'reachable_time': 37460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225808, 'error': None, 'target': 'ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:39:57 np0005538960 systemd[1]: run-netns-ovnmeta\x2daf1be1f9\x2d0fe9\x2d4225\x2da7a3\x2d625e96eb67d3.mount: Deactivated successfully.
Nov 28 11:39:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:57.098 104482 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-af1be1f9-0fe9-4225-a7a3-625e96eb67d3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Nov 28 11:39:57 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:39:57.098 104482 DEBUG oslo.privsep.daemon [-] privsep: reply[6e8b4416-fa33-40a4-8ae0-9d4db8db572f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:39:57 np0005538960 nova_compute[187252]: 2025-11-28 16:39:57.112 187256 INFO nova.compute.manager [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Took 0.46 seconds to destroy the instance on the hypervisor.#033[00m
Nov 28 11:39:57 np0005538960 nova_compute[187252]: 2025-11-28 16:39:57.113 187256 DEBUG oslo.service.loopingcall [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Nov 28 11:39:57 np0005538960 nova_compute[187252]: 2025-11-28 16:39:57.113 187256 DEBUG nova.compute.manager [-] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Nov 28 11:39:57 np0005538960 nova_compute[187252]: 2025-11-28 16:39:57.113 187256 DEBUG nova.network.neutron [-] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Nov 28 11:39:58 np0005538960 nova_compute[187252]: 2025-11-28 16:39:58.167 187256 DEBUG nova.compute.manager [req-dfa93de1-8f69-496d-b33c-4137348bf85b req-a06eac60-5aec-4a27-8384-1369756eefa1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received event network-vif-unplugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:39:58 np0005538960 nova_compute[187252]: 2025-11-28 16:39:58.169 187256 DEBUG oslo_concurrency.lockutils [req-dfa93de1-8f69-496d-b33c-4137348bf85b req-a06eac60-5aec-4a27-8384-1369756eefa1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:39:58 np0005538960 nova_compute[187252]: 2025-11-28 16:39:58.170 187256 DEBUG oslo_concurrency.lockutils [req-dfa93de1-8f69-496d-b33c-4137348bf85b req-a06eac60-5aec-4a27-8384-1369756eefa1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:39:58 np0005538960 nova_compute[187252]: 2025-11-28 16:39:58.170 187256 DEBUG oslo_concurrency.lockutils [req-dfa93de1-8f69-496d-b33c-4137348bf85b req-a06eac60-5aec-4a27-8384-1369756eefa1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:39:58 np0005538960 nova_compute[187252]: 2025-11-28 16:39:58.170 187256 DEBUG nova.compute.manager [req-dfa93de1-8f69-496d-b33c-4137348bf85b req-a06eac60-5aec-4a27-8384-1369756eefa1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] No waiting events found dispatching network-vif-unplugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:39:58 np0005538960 nova_compute[187252]: 2025-11-28 16:39:58.170 187256 DEBUG nova.compute.manager [req-dfa93de1-8f69-496d-b33c-4137348bf85b req-a06eac60-5aec-4a27-8384-1369756eefa1 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received event network-vif-unplugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Nov 28 11:39:58 np0005538960 nova_compute[187252]: 2025-11-28 16:39:58.949 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:00 np0005538960 podman[225809]: 2025-11-28 16:40:00.193631223 +0000 UTC m=+0.095372644 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:40:00 np0005538960 nova_compute[187252]: 2025-11-28 16:40:00.307 187256 DEBUG nova.compute.manager [req-9b63891c-cc86-4a7e-8c9b-0420a8842c16 req-abb510a7-a9b7-4bae-9ddc-ad731763d30b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received event network-vif-plugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:40:00 np0005538960 nova_compute[187252]: 2025-11-28 16:40:00.308 187256 DEBUG oslo_concurrency.lockutils [req-9b63891c-cc86-4a7e-8c9b-0420a8842c16 req-abb510a7-a9b7-4bae-9ddc-ad731763d30b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Acquiring lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:40:00 np0005538960 nova_compute[187252]: 2025-11-28 16:40:00.308 187256 DEBUG oslo_concurrency.lockutils [req-9b63891c-cc86-4a7e-8c9b-0420a8842c16 req-abb510a7-a9b7-4bae-9ddc-ad731763d30b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:40:00 np0005538960 nova_compute[187252]: 2025-11-28 16:40:00.308 187256 DEBUG oslo_concurrency.lockutils [req-9b63891c-cc86-4a7e-8c9b-0420a8842c16 req-abb510a7-a9b7-4bae-9ddc-ad731763d30b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:40:00 np0005538960 nova_compute[187252]: 2025-11-28 16:40:00.308 187256 DEBUG nova.compute.manager [req-9b63891c-cc86-4a7e-8c9b-0420a8842c16 req-abb510a7-a9b7-4bae-9ddc-ad731763d30b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] No waiting events found dispatching network-vif-plugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Nov 28 11:40:00 np0005538960 nova_compute[187252]: 2025-11-28 16:40:00.308 187256 WARNING nova.compute.manager [req-9b63891c-cc86-4a7e-8c9b-0420a8842c16 req-abb510a7-a9b7-4bae-9ddc-ad731763d30b 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received unexpected event network-vif-plugged-56cebb30-3170-4bdb-bf92-e59e3d4a20ea for instance with vm_state active and task_state deleting.#033[00m
Nov 28 11:40:00 np0005538960 nova_compute[187252]: 2025-11-28 16:40:00.914 187256 DEBUG nova.network.neutron [-] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:40:00 np0005538960 nova_compute[187252]: 2025-11-28 16:40:00.961 187256 INFO nova.compute.manager [-] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Took 3.85 seconds to deallocate network for instance.#033[00m
Nov 28 11:40:00 np0005538960 nova_compute[187252]: 2025-11-28 16:40:00.985 187256 DEBUG nova.compute.manager [req-8d6699ee-4b36-4f4d-8c8c-80f9c3d86423 req-cb653b0b-bd9b-4fd1-9779-d9d4ea605f27 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Received event network-vif-deleted-56cebb30-3170-4bdb-bf92-e59e3d4a20ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.034 187256 DEBUG oslo_concurrency.lockutils [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.034 187256 DEBUG oslo_concurrency.lockutils [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.069 187256 DEBUG nova.scheduler.client.report [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Refreshing inventories for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.089 187256 DEBUG nova.scheduler.client.report [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Updating ProviderTree inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.090 187256 DEBUG nova.compute.provider_tree [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.106 187256 DEBUG nova.scheduler.client.report [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Refreshing aggregate associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.109 187256 DEBUG nova.network.neutron [req-a1b42bfb-9d26-4634-82e1-9e9be8341bf3 req-e0299907-2a44-4d07-87f0-360bc41d573f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Updated VIF entry in instance network info cache for port 56cebb30-3170-4bdb-bf92-e59e3d4a20ea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.110 187256 DEBUG nova.network.neutron [req-a1b42bfb-9d26-4634-82e1-9e9be8341bf3 req-e0299907-2a44-4d07-87f0-360bc41d573f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Updating instance_info_cache with network_info: [{"id": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "address": "fa:16:3e:46:5f:7b", "network": {"id": "af1be1f9-0fe9-4225-a7a3-625e96eb67d3", "bridge": "br-int", "label": "tempest-network-smoke--2011467037", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe46:5f7b", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "b5f802fe6e0b4d62bba6143515207a40", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56cebb30-31", "ovs_interfaceid": "56cebb30-3170-4bdb-bf92-e59e3d4a20ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.126 187256 DEBUG nova.scheduler.client.report [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Refreshing trait associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, traits: COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.131 187256 DEBUG oslo_concurrency.lockutils [req-a1b42bfb-9d26-4634-82e1-9e9be8341bf3 req-e0299907-2a44-4d07-87f0-360bc41d573f 02182d255e664cc897ebc8d3a3299c4e 195a4dfcc6bc469d9514ec3d7e7ad833 - - default default] Releasing lock "refresh_cache-95f49283-16da-4af6-b1a2-acba779ab5e4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.199 187256 DEBUG nova.compute.provider_tree [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.213 187256 DEBUG nova.scheduler.client.report [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.244 187256 DEBUG oslo_concurrency.lockutils [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.210s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.289 187256 INFO nova.scheduler.client.report [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Deleted allocations for instance 95f49283-16da-4af6-b1a2-acba779ab5e4#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.406 187256 DEBUG oslo_concurrency.lockutils [None req-40a9bca3-3a0f-46d9-bd6d-c6c0b34b0dca 23b8e0c173df4c2883fccd8cb472e427 b5f802fe6e0b4d62bba6143515207a40 - - default default] Lock "95f49283-16da-4af6-b1a2-acba779ab5e4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:40:01 np0005538960 nova_compute[187252]: 2025-11-28 16:40:01.955 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:03 np0005538960 podman[225836]: 2025-11-28 16:40:03.160781704 +0000 UTC m=+0.059231150 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Nov 28 11:40:03 np0005538960 podman[225835]: 2025-11-28 16:40:03.202148349 +0000 UTC m=+0.102756628 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 11:40:03 np0005538960 nova_compute[187252]: 2025-11-28 16:40:03.952 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:40:06.364 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:40:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:40:06.364 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:40:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:40:06.365 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:40:06 np0005538960 nova_compute[187252]: 2025-11-28 16:40:06.960 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:08 np0005538960 podman[225876]: 2025-11-28 16:40:08.167781211 +0000 UTC m=+0.066054289 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:40:08 np0005538960 nova_compute[187252]: 2025-11-28 16:40:08.953 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:10 np0005538960 podman[225902]: 2025-11-28 16:40:10.166892906 +0000 UTC m=+0.063009193 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350)
Nov 28 11:40:11 np0005538960 nova_compute[187252]: 2025-11-28 16:40:11.917 187256 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1764347996.9148169, 95f49283-16da-4af6-b1a2-acba779ab5e4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Nov 28 11:40:11 np0005538960 nova_compute[187252]: 2025-11-28 16:40:11.918 187256 INFO nova.compute.manager [-] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] VM Stopped (Lifecycle Event)#033[00m
Nov 28 11:40:11 np0005538960 nova_compute[187252]: 2025-11-28 16:40:11.940 187256 DEBUG nova.compute.manager [None req-ea2dbd67-e5fb-47a3-b268-28f475467c11 - - - - - -] [instance: 95f49283-16da-4af6-b1a2-acba779ab5e4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Nov 28 11:40:11 np0005538960 nova_compute[187252]: 2025-11-28 16:40:11.963 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:13 np0005538960 nova_compute[187252]: 2025-11-28 16:40:13.959 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:16 np0005538960 nova_compute[187252]: 2025-11-28 16:40:16.968 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:18 np0005538960 nova_compute[187252]: 2025-11-28 16:40:18.962 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:19 np0005538960 podman[225924]: 2025-11-28 16:40:19.172943366 +0000 UTC m=+0.072601570 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 11:40:21 np0005538960 nova_compute[187252]: 2025-11-28 16:40:21.972 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:23 np0005538960 nova_compute[187252]: 2025-11-28 16:40:23.965 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:25 np0005538960 podman[225945]: 2025-11-28 16:40:25.147843707 +0000 UTC m=+0.054850921 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:40:26 np0005538960 nova_compute[187252]: 2025-11-28 16:40:26.975 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:28 np0005538960 nova_compute[187252]: 2025-11-28 16:40:28.968 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:29 np0005538960 nova_compute[187252]: 2025-11-28 16:40:29.471 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:29 np0005538960 nova_compute[187252]: 2025-11-28 16:40:29.575 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:31 np0005538960 podman[225971]: 2025-11-28 16:40:31.227357261 +0000 UTC m=+0.135553782 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.build-date=20251125, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Nov 28 11:40:31 np0005538960 nova_compute[187252]: 2025-11-28 16:40:31.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:40:31 np0005538960 nova_compute[187252]: 2025-11-28 16:40:31.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:40:31 np0005538960 nova_compute[187252]: 2025-11-28 16:40:31.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:40:31 np0005538960 nova_compute[187252]: 2025-11-28 16:40:31.333 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:40:31 np0005538960 nova_compute[187252]: 2025-11-28 16:40:31.334 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:40:31 np0005538960 nova_compute[187252]: 2025-11-28 16:40:31.977 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:33 np0005538960 nova_compute[187252]: 2025-11-28 16:40:33.317 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:40:33 np0005538960 nova_compute[187252]: 2025-11-28 16:40:33.317 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:40:33 np0005538960 nova_compute[187252]: 2025-11-28 16:40:33.970 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:34 np0005538960 podman[225999]: 2025-11-28 16:40:34.156008217 +0000 UTC m=+0.062890150 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 11:40:34 np0005538960 podman[226000]: 2025-11-28 16:40:34.179212992 +0000 UTC m=+0.082447835 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Nov 28 11:40:34 np0005538960 nova_compute[187252]: 2025-11-28 16:40:34.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:40:34 np0005538960 nova_compute[187252]: 2025-11-28 16:40:34.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:40:34 np0005538960 nova_compute[187252]: 2025-11-28 16:40:34.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.325 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.326 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.327 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.328 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:40:35.329 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:40:36 np0005538960 nova_compute[187252]: 2025-11-28 16:40:36.982 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.341 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.341 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.341 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.341 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.522 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.524 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5759MB free_disk=73.33732986450195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.524 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.524 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.594 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.594 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.625 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.637 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.653 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:40:37 np0005538960 nova_compute[187252]: 2025-11-28 16:40:37.654 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:40:38 np0005538960 nova_compute[187252]: 2025-11-28 16:40:38.973 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:39 np0005538960 podman[226037]: 2025-11-28 16:40:39.163117718 +0000 UTC m=+0.069496204 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:40:39 np0005538960 nova_compute[187252]: 2025-11-28 16:40:39.654 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:40:40 np0005538960 nova_compute[187252]: 2025-11-28 16:40:40.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:40:41 np0005538960 podman[226063]: 2025-11-28 16:40:41.146371459 +0000 UTC m=+0.052340909 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, version=9.6, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is 
a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:40:41 np0005538960 nova_compute[187252]: 2025-11-28 16:40:41.985 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:43 np0005538960 nova_compute[187252]: 2025-11-28 16:40:43.976 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:46 np0005538960 nova_compute[187252]: 2025-11-28 16:40:46.988 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:48 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:40:48.786 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a1:de:2d 10.100.0.2 2001:db8::f816:3eff:fea1:de2d'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fea1:de2d/64', 'neutron:device_id': 'ovnmeta-9b00c97a-52c0-4456-bf63-836f659e3d80', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b00c97a-52c0-4456-bf63-836f659e3d80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d87d0e98-0db3-4a07-bd0e-19d40991582e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=67d6dd05-4de9-4054-9ae2-817f35b51d39) old=Port_Binding(mac=['fa:16:3e:a1:de:2d 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-9b00c97a-52c0-4456-bf63-836f659e3d80', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9b00c97a-52c0-4456-bf63-836f659e3d80', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b5f802fe6e0b4d62bba6143515207a40', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:40:48 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:40:48.788 104369 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 67d6dd05-4de9-4054-9ae2-817f35b51d39 in datapath 9b00c97a-52c0-4456-bf63-836f659e3d80 updated#033[00m
Nov 28 11:40:48 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:40:48.789 104369 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9b00c97a-52c0-4456-bf63-836f659e3d80, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Nov 28 11:40:48 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:40:48.791 214244 DEBUG oslo.privsep.daemon [-] privsep: reply[1d765e02-e1ec-4a72-8562-f74304da9312]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Nov 28 11:40:49 np0005538960 nova_compute[187252]: 2025-11-28 16:40:49.005 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:50 np0005538960 podman[226082]: 2025-11-28 16:40:50.169937364 +0000 UTC m=+0.079240405 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:40:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:40:50.978 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:40:50 np0005538960 nova_compute[187252]: 2025-11-28 16:40:50.979 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:50 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:40:50.979 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:40:51 np0005538960 nova_compute[187252]: 2025-11-28 16:40:51.991 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:53 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:40:53.982 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:40:54 np0005538960 nova_compute[187252]: 2025-11-28 16:40:54.014 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:56 np0005538960 podman[226103]: 2025-11-28 16:40:56.162422591 +0000 UTC m=+0.069043603 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:40:56 np0005538960 nova_compute[187252]: 2025-11-28 16:40:56.995 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:40:59 np0005538960 nova_compute[187252]: 2025-11-28 16:40:59.009 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:02 np0005538960 nova_compute[187252]: 2025-11-28 16:41:01.999 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:02 np0005538960 podman[226128]: 2025-11-28 16:41:02.169177401 +0000 UTC m=+0.080034995 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Nov 28 11:41:04 np0005538960 nova_compute[187252]: 2025-11-28 16:41:04.012 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:05 np0005538960 podman[226155]: 2025-11-28 16:41:05.144833253 +0000 UTC m=+0.051088377 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Nov 28 11:41:05 np0005538960 podman[226154]: 2025-11-28 16:41:05.150607846 +0000 UTC m=+0.060328116 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Nov 28 11:41:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:41:06.365 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:41:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:41:06.365 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:41:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:41:06.365 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:41:07 np0005538960 nova_compute[187252]: 2025-11-28 16:41:07.002 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:09 np0005538960 nova_compute[187252]: 2025-11-28 16:41:09.014 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:10 np0005538960 podman[226194]: 2025-11-28 16:41:10.143892345 +0000 UTC m=+0.051084118 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:41:12 np0005538960 nova_compute[187252]: 2025-11-28 16:41:12.005 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:12 np0005538960 podman[226219]: 2025-11-28 16:41:12.158413802 +0000 UTC m=+0.064914430 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, 
url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:41:14 np0005538960 nova_compute[187252]: 2025-11-28 16:41:14.024 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:14 np0005538960 ovn_controller[95460]: 2025-11-28T16:41:14Z|00248|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Nov 28 11:41:17 np0005538960 nova_compute[187252]: 2025-11-28 16:41:17.008 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:19 np0005538960 nova_compute[187252]: 2025-11-28 16:41:19.027 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:21 np0005538960 podman[226241]: 2025-11-28 16:41:21.148738602 +0000 UTC m=+0.058631765 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 11:41:22 np0005538960 nova_compute[187252]: 2025-11-28 16:41:22.011 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:24 np0005538960 nova_compute[187252]: 2025-11-28 16:41:24.029 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:26 np0005538960 nova_compute[187252]: 2025-11-28 16:41:26.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:26 np0005538960 podman[226262]: 2025-11-28 16:41:26.860985681 +0000 UTC m=+0.049912709 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:41:27 np0005538960 nova_compute[187252]: 2025-11-28 16:41:27.014 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:29 np0005538960 nova_compute[187252]: 2025-11-28 16:41:29.032 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:31 np0005538960 nova_compute[187252]: 2025-11-28 16:41:31.325 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:31 np0005538960 nova_compute[187252]: 2025-11-28 16:41:31.326 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:41:31 np0005538960 nova_compute[187252]: 2025-11-28 16:41:31.326 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:41:31 np0005538960 nova_compute[187252]: 2025-11-28 16:41:31.338 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:41:32 np0005538960 nova_compute[187252]: 2025-11-28 16:41:32.017 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:33 np0005538960 podman[226287]: 2025-11-28 16:41:33.186815141 +0000 UTC m=+0.096124824 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Nov 28 11:41:33 np0005538960 nova_compute[187252]: 2025-11-28 16:41:33.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:33 np0005538960 nova_compute[187252]: 2025-11-28 16:41:33.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:33 np0005538960 nova_compute[187252]: 2025-11-28 16:41:33.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:41:34 np0005538960 nova_compute[187252]: 2025-11-28 16:41:34.035 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:34 np0005538960 nova_compute[187252]: 2025-11-28 16:41:34.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:34 np0005538960 nova_compute[187252]: 2025-11-28 16:41:34.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:36 np0005538960 podman[226316]: 2025-11-28 16:41:36.158297989 +0000 UTC m=+0.058375367 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 11:41:36 np0005538960 podman[226315]: 2025-11-28 16:41:36.161580131 +0000 UTC m=+0.066065548 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Nov 28 11:41:36 np0005538960 nova_compute[187252]: 2025-11-28 16:41:36.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:36 np0005538960 nova_compute[187252]: 2025-11-28 16:41:36.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:36 np0005538960 nova_compute[187252]: 2025-11-28 16:41:36.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 11:41:36 np0005538960 nova_compute[187252]: 2025-11-28 16:41:36.336 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 11:41:37 np0005538960 nova_compute[187252]: 2025-11-28 16:41:37.021 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:38 np0005538960 nova_compute[187252]: 2025-11-28 16:41:38.331 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.039 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.347 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.348 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.348 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.348 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.506 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.507 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5764MB free_disk=73.33732986450195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.507 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.507 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.568 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.569 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.654 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.667 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.669 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:41:39 np0005538960 nova_compute[187252]: 2025-11-28 16:41:39.669 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:41:41 np0005538960 podman[226353]: 2025-11-28 16:41:41.165643766 +0000 UTC m=+0.066539821 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:41:41 np0005538960 nova_compute[187252]: 2025-11-28 16:41:41.670 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:42 np0005538960 nova_compute[187252]: 2025-11-28 16:41:42.024 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:42 np0005538960 nova_compute[187252]: 2025-11-28 16:41:42.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:43 np0005538960 podman[226377]: 2025-11-28 16:41:43.141004912 +0000 UTC m=+0.051762413 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, distribution-scope=public)
Nov 28 11:41:44 np0005538960 nova_compute[187252]: 2025-11-28 16:41:44.041 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:47 np0005538960 nova_compute[187252]: 2025-11-28 16:41:47.027 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:47 np0005538960 nova_compute[187252]: 2025-11-28 16:41:47.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:41:47 np0005538960 nova_compute[187252]: 2025-11-28 16:41:47.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 11:41:49 np0005538960 nova_compute[187252]: 2025-11-28 16:41:49.043 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:41:51.987 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:41:51 np0005538960 nova_compute[187252]: 2025-11-28 16:41:51.987 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:41:51.989 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:41:51 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:41:51.990 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:41:52 np0005538960 nova_compute[187252]: 2025-11-28 16:41:52.029 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:52 np0005538960 podman[226398]: 2025-11-28 16:41:52.159033719 +0000 UTC m=+0.064934890 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:41:54 np0005538960 nova_compute[187252]: 2025-11-28 16:41:54.045 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:57 np0005538960 nova_compute[187252]: 2025-11-28 16:41:57.032 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:41:57 np0005538960 podman[226418]: 2025-11-28 16:41:57.193378535 +0000 UTC m=+0.089161001 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:41:59 np0005538960 nova_compute[187252]: 2025-11-28 16:41:59.049 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:00 np0005538960 nova_compute[187252]: 2025-11-28 16:42:00.437 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:02 np0005538960 nova_compute[187252]: 2025-11-28 16:42:02.036 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:04 np0005538960 nova_compute[187252]: 2025-11-28 16:42:04.052 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:04 np0005538960 podman[226442]: 2025-11-28 16:42:04.218323295 +0000 UTC m=+0.116529109 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:42:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:42:06.366 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:42:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:42:06.367 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:42:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:42:06.367 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:42:06 np0005538960 podman[226467]: 2025-11-28 16:42:06.459304837 +0000 UTC m=+0.061106906 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 11:42:06 np0005538960 podman[226466]: 2025-11-28 16:42:06.460721222 +0000 UTC m=+0.066766976 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=multipathd)
Nov 28 11:42:07 np0005538960 nova_compute[187252]: 2025-11-28 16:42:07.039 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:09 np0005538960 nova_compute[187252]: 2025-11-28 16:42:09.053 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:12 np0005538960 nova_compute[187252]: 2025-11-28 16:42:12.042 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:12 np0005538960 podman[226503]: 2025-11-28 16:42:12.16809467 +0000 UTC m=+0.069143785 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:42:14 np0005538960 nova_compute[187252]: 2025-11-28 16:42:14.088 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:14 np0005538960 podman[226528]: 2025-11-28 16:42:14.20410856 +0000 UTC m=+0.078809575 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, release=1755695350, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Nov 28 11:42:17 np0005538960 nova_compute[187252]: 2025-11-28 16:42:17.046 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:19 np0005538960 nova_compute[187252]: 2025-11-28 16:42:19.091 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:22 np0005538960 nova_compute[187252]: 2025-11-28 16:42:22.050 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:23 np0005538960 podman[226551]: 2025-11-28 16:42:23.173228025 +0000 UTC m=+0.073042782 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 11:42:24 np0005538960 nova_compute[187252]: 2025-11-28 16:42:24.093 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:27 np0005538960 nova_compute[187252]: 2025-11-28 16:42:27.053 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:28 np0005538960 podman[226572]: 2025-11-28 16:42:28.150783852 +0000 UTC m=+0.056231595 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:42:29 np0005538960 nova_compute[187252]: 2025-11-28 16:42:29.094 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:32 np0005538960 nova_compute[187252]: 2025-11-28 16:42:32.056 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:32 np0005538960 nova_compute[187252]: 2025-11-28 16:42:32.409 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:33 np0005538960 nova_compute[187252]: 2025-11-28 16:42:33.334 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:33 np0005538960 nova_compute[187252]: 2025-11-28 16:42:33.334 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:42:33 np0005538960 nova_compute[187252]: 2025-11-28 16:42:33.334 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:42:33 np0005538960 nova_compute[187252]: 2025-11-28 16:42:33.352 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:42:33 np0005538960 nova_compute[187252]: 2025-11-28 16:42:33.352 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:34 np0005538960 nova_compute[187252]: 2025-11-28 16:42:34.096 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:34 np0005538960 nova_compute[187252]: 2025-11-28 16:42:34.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:34 np0005538960 nova_compute[187252]: 2025-11-28 16:42:34.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:42:35 np0005538960 podman[226596]: 2025-11-28 16:42:35.189950216 +0000 UTC m=+0.100731769 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:42:35 np0005538960 nova_compute[187252]: 2025-11-28 16:42:35.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:42:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:42:36 np0005538960 nova_compute[187252]: 2025-11-28 16:42:36.310 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:37 np0005538960 nova_compute[187252]: 2025-11-28 16:42:37.059 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:37 np0005538960 podman[226623]: 2025-11-28 16:42:37.167482816 +0000 UTC m=+0.065702761 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 11:42:37 np0005538960 podman[226622]: 2025-11-28 16:42:37.173873504 +0000 UTC m=+0.075034211 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 11:42:38 np0005538960 nova_compute[187252]: 2025-11-28 16:42:38.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.097 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.345 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.346 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.347 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.347 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.508 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.509 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5756MB free_disk=73.33732986450195GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.510 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.510 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.573 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.574 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.778 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.798 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.799 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:42:39 np0005538960 nova_compute[187252]: 2025-11-28 16:42:39.800 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:42:42 np0005538960 nova_compute[187252]: 2025-11-28 16:42:42.062 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:43 np0005538960 podman[226660]: 2025-11-28 16:42:43.18648397 +0000 UTC m=+0.084789883 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:42:43 np0005538960 nova_compute[187252]: 2025-11-28 16:42:43.800 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:44 np0005538960 nova_compute[187252]: 2025-11-28 16:42:44.099 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:44 np0005538960 nova_compute[187252]: 2025-11-28 16:42:44.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:42:45 np0005538960 podman[226684]: 2025-11-28 16:42:45.159253932 +0000 UTC m=+0.062260655 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc.)
Nov 28 11:42:47 np0005538960 nova_compute[187252]: 2025-11-28 16:42:47.066 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:49 np0005538960 nova_compute[187252]: 2025-11-28 16:42:49.101 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:52 np0005538960 nova_compute[187252]: 2025-11-28 16:42:52.070 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:54 np0005538960 nova_compute[187252]: 2025-11-28 16:42:54.149 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:54 np0005538960 podman[226705]: 2025-11-28 16:42:54.174552472 +0000 UTC m=+0.081551483 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:42:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:42:56.441 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:42:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:42:56.442 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:42:56 np0005538960 nova_compute[187252]: 2025-11-28 16:42:56.443 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:57 np0005538960 nova_compute[187252]: 2025-11-28 16:42:57.073 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:59 np0005538960 nova_compute[187252]: 2025-11-28 16:42:59.152 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:42:59 np0005538960 podman[226725]: 2025-11-28 16:42:59.183040558 +0000 UTC m=+0.082281111 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:43:02 np0005538960 nova_compute[187252]: 2025-11-28 16:43:02.076 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:03 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:43:03.446 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:43:04 np0005538960 nova_compute[187252]: 2025-11-28 16:43:04.154 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:06 np0005538960 podman[226752]: 2025-11-28 16:43:06.20320362 +0000 UTC m=+0.094796352 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125)
Nov 28 11:43:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:43:06.369 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:43:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:43:06.369 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:43:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:43:06.369 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:43:07 np0005538960 nova_compute[187252]: 2025-11-28 16:43:07.080 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:08 np0005538960 podman[226779]: 2025-11-28 16:43:08.168469136 +0000 UTC m=+0.055934238 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 11:43:08 np0005538960 podman[226778]: 2025-11-28 16:43:08.205105524 +0000 UTC m=+0.097467197 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 11:43:09 np0005538960 nova_compute[187252]: 2025-11-28 16:43:09.157 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:12 np0005538960 nova_compute[187252]: 2025-11-28 16:43:12.083 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:14 np0005538960 podman[226821]: 2025-11-28 16:43:14.1410759 +0000 UTC m=+0.052286307 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:43:14 np0005538960 nova_compute[187252]: 2025-11-28 16:43:14.161 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:16 np0005538960 podman[226845]: 2025-11-28 16:43:16.213926843 +0000 UTC m=+0.103750233 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:43:17 np0005538960 nova_compute[187252]: 2025-11-28 16:43:17.087 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:19 np0005538960 nova_compute[187252]: 2025-11-28 16:43:19.163 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:22 np0005538960 nova_compute[187252]: 2025-11-28 16:43:22.092 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:24 np0005538960 nova_compute[187252]: 2025-11-28 16:43:24.166 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:25 np0005538960 podman[226866]: 2025-11-28 16:43:25.154732206 +0000 UTC m=+0.060157683 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:43:27 np0005538960 nova_compute[187252]: 2025-11-28 16:43:27.096 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:29 np0005538960 nova_compute[187252]: 2025-11-28 16:43:29.168 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:30 np0005538960 podman[226888]: 2025-11-28 16:43:30.160783859 +0000 UTC m=+0.060666325 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:43:32 np0005538960 nova_compute[187252]: 2025-11-28 16:43:32.099 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:33 np0005538960 nova_compute[187252]: 2025-11-28 16:43:33.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:43:33 np0005538960 nova_compute[187252]: 2025-11-28 16:43:33.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:43:33 np0005538960 nova_compute[187252]: 2025-11-28 16:43:33.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:43:33 np0005538960 nova_compute[187252]: 2025-11-28 16:43:33.338 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:43:34 np0005538960 nova_compute[187252]: 2025-11-28 16:43:34.171 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:34 np0005538960 nova_compute[187252]: 2025-11-28 16:43:34.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:43:35 np0005538960 nova_compute[187252]: 2025-11-28 16:43:35.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:43:36 np0005538960 nova_compute[187252]: 2025-11-28 16:43:36.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:43:36 np0005538960 nova_compute[187252]: 2025-11-28 16:43:36.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:43:37 np0005538960 nova_compute[187252]: 2025-11-28 16:43:37.102 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:37 np0005538960 podman[226913]: 2025-11-28 16:43:37.187372751 +0000 UTC m=+0.096265528 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Nov 28 11:43:37 np0005538960 nova_compute[187252]: 2025-11-28 16:43:37.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:43:39 np0005538960 podman[226941]: 2025-11-28 16:43:39.151271743 +0000 UTC m=+0.057475126 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Nov 28 11:43:39 np0005538960 nova_compute[187252]: 2025-11-28 16:43:39.173 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:39 np0005538960 podman[226942]: 2025-11-28 16:43:39.176813406 +0000 UTC m=+0.081329807 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 11:43:40 np0005538960 nova_compute[187252]: 2025-11-28 16:43:40.310 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:43:40 np0005538960 nova_compute[187252]: 2025-11-28 16:43:40.326 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.341 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.341 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.341 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.341 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.499 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.500 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5761MB free_disk=73.33752059936523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.501 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.501 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.686 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.687 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.718 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.730 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.732 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:43:41 np0005538960 nova_compute[187252]: 2025-11-28 16:43:41.732 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.231s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:43:42 np0005538960 nova_compute[187252]: 2025-11-28 16:43:42.105 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:43 np0005538960 nova_compute[187252]: 2025-11-28 16:43:43.732 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:43:44 np0005538960 nova_compute[187252]: 2025-11-28 16:43:44.174 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:45 np0005538960 podman[226980]: 2025-11-28 16:43:45.151752898 +0000 UTC m=+0.062126461 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:43:45 np0005538960 nova_compute[187252]: 2025-11-28 16:43:45.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:43:47 np0005538960 nova_compute[187252]: 2025-11-28 16:43:47.108 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:47 np0005538960 podman[227004]: 2025-11-28 16:43:47.150362011 +0000 UTC m=+0.061900126 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-type=git, io.buildah.version=1.33.7, config_id=edpm, maintainer=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 11:43:49 np0005538960 nova_compute[187252]: 2025-11-28 16:43:49.177 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:52 np0005538960 nova_compute[187252]: 2025-11-28 16:43:52.111 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:54 np0005538960 nova_compute[187252]: 2025-11-28 16:43:54.178 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:56 np0005538960 podman[227025]: 2025-11-28 16:43:56.154972524 +0000 UTC m=+0.058435270 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 11:43:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:43:56.510 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:43:56 np0005538960 nova_compute[187252]: 2025-11-28 16:43:56.511 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:56 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:43:56.512 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:43:57 np0005538960 nova_compute[187252]: 2025-11-28 16:43:57.114 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:43:59 np0005538960 nova_compute[187252]: 2025-11-28 16:43:59.180 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:01 np0005538960 podman[227047]: 2025-11-28 16:44:01.139077145 +0000 UTC m=+0.050652267 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 11:44:02 np0005538960 nova_compute[187252]: 2025-11-28 16:44:02.117 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:04 np0005538960 nova_compute[187252]: 2025-11-28 16:44:04.182 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:04 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:44:04.515 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:44:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:44:06.370 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:44:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:44:06.371 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:44:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:44:06.371 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:44:07 np0005538960 nova_compute[187252]: 2025-11-28 16:44:07.121 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:08 np0005538960 podman[227073]: 2025-11-28 16:44:08.240227243 +0000 UTC m=+0.130951536 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:44:09 np0005538960 nova_compute[187252]: 2025-11-28 16:44:09.184 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:10 np0005538960 podman[227100]: 2025-11-28 16:44:10.154744332 +0000 UTC m=+0.054701847 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Nov 28 11:44:10 np0005538960 podman[227099]: 2025-11-28 16:44:10.218173735 +0000 UTC m=+0.111605719 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd)
Nov 28 11:44:12 np0005538960 nova_compute[187252]: 2025-11-28 16:44:12.124 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:14 np0005538960 nova_compute[187252]: 2025-11-28 16:44:14.187 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:16 np0005538960 podman[227137]: 2025-11-28 16:44:16.146845469 +0000 UTC m=+0.055152317 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:44:17 np0005538960 nova_compute[187252]: 2025-11-28 16:44:17.127 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:18 np0005538960 podman[227160]: 2025-11-28 16:44:18.159619574 +0000 UTC m=+0.069071443 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, architecture=x86_64, maintainer=Red Hat, Inc., config_id=edpm)
Nov 28 11:44:19 np0005538960 nova_compute[187252]: 2025-11-28 16:44:19.197 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:22 np0005538960 nova_compute[187252]: 2025-11-28 16:44:22.131 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:24 np0005538960 nova_compute[187252]: 2025-11-28 16:44:24.200 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:27 np0005538960 nova_compute[187252]: 2025-11-28 16:44:27.287 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:27 np0005538960 podman[227184]: 2025-11-28 16:44:27.357297604 +0000 UTC m=+0.050943116 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:44:29 np0005538960 nova_compute[187252]: 2025-11-28 16:44:29.201 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:32 np0005538960 podman[227204]: 2025-11-28 16:44:32.175873722 +0000 UTC m=+0.078192482 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 11:44:32 np0005538960 nova_compute[187252]: 2025-11-28 16:44:32.290 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:34 np0005538960 nova_compute[187252]: 2025-11-28 16:44:34.204 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:34 np0005538960 nova_compute[187252]: 2025-11-28 16:44:34.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:44:34 np0005538960 nova_compute[187252]: 2025-11-28 16:44:34.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:44:34 np0005538960 nova_compute[187252]: 2025-11-28 16:44:34.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:44:34 np0005538960 nova_compute[187252]: 2025-11-28 16:44:34.327 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.316 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.317 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:44:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:44:36 np0005538960 nova_compute[187252]: 2025-11-28 16:44:36.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:44:37 np0005538960 nova_compute[187252]: 2025-11-28 16:44:37.296 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:37 np0005538960 nova_compute[187252]: 2025-11-28 16:44:37.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:44:37 np0005538960 nova_compute[187252]: 2025-11-28 16:44:37.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:44:37 np0005538960 nova_compute[187252]: 2025-11-28 16:44:37.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:44:39 np0005538960 nova_compute[187252]: 2025-11-28 16:44:39.205 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:39 np0005538960 podman[227228]: 2025-11-28 16:44:39.215047043 +0000 UTC m=+0.114185292 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:44:39 np0005538960 nova_compute[187252]: 2025-11-28 16:44:39.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:44:41 np0005538960 podman[227255]: 2025-11-28 16:44:41.146659419 +0000 UTC m=+0.054252597 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:44:41 np0005538960 podman[227254]: 2025-11-28 16:44:41.167807726 +0000 UTC m=+0.077823213 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.346 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.346 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.347 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.347 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.525 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.527 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5759MB free_disk=73.33752059936523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.527 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.528 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.595 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.596 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.633 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.646 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.647 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.648 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:44:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:44:41.788 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:44:41 np0005538960 nova_compute[187252]: 2025-11-28 16:44:41.789 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:41 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:44:41.790 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:44:42 np0005538960 nova_compute[187252]: 2025-11-28 16:44:42.298 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:42 np0005538960 nova_compute[187252]: 2025-11-28 16:44:42.649 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:44:44 np0005538960 nova_compute[187252]: 2025-11-28 16:44:44.205 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:45 np0005538960 nova_compute[187252]: 2025-11-28 16:44:45.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:44:46 np0005538960 podman[227290]: 2025-11-28 16:44:46.457747347 +0000 UTC m=+0.051126432 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:44:47 np0005538960 nova_compute[187252]: 2025-11-28 16:44:47.300 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:47 np0005538960 nova_compute[187252]: 2025-11-28 16:44:47.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:44:47 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:44:47.792 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:44:49 np0005538960 podman[227314]: 2025-11-28 16:44:49.159386036 +0000 UTC m=+0.059250169 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 11:44:49 np0005538960 nova_compute[187252]: 2025-11-28 16:44:49.208 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:52 np0005538960 nova_compute[187252]: 2025-11-28 16:44:52.302 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:54 np0005538960 nova_compute[187252]: 2025-11-28 16:44:54.210 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:57 np0005538960 nova_compute[187252]: 2025-11-28 16:44:57.303 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:44:58 np0005538960 podman[227335]: 2025-11-28 16:44:58.165280661 +0000 UTC m=+0.069300764 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 11:44:59 np0005538960 nova_compute[187252]: 2025-11-28 16:44:59.211 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:02 np0005538960 nova_compute[187252]: 2025-11-28 16:45:02.306 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:03 np0005538960 podman[227355]: 2025-11-28 16:45:03.171018164 +0000 UTC m=+0.077407324 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:45:04 np0005538960 nova_compute[187252]: 2025-11-28 16:45:04.240 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:45:06.371 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:45:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:45:06.371 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:45:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:45:06.372 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:45:07 np0005538960 nova_compute[187252]: 2025-11-28 16:45:07.308 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:09 np0005538960 nova_compute[187252]: 2025-11-28 16:45:09.242 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:10 np0005538960 podman[227379]: 2025-11-28 16:45:10.198929798 +0000 UTC m=+0.109966277 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller)
Nov 28 11:45:12 np0005538960 podman[227407]: 2025-11-28 16:45:12.186243637 +0000 UTC m=+0.080426006 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Nov 28 11:45:12 np0005538960 podman[227406]: 2025-11-28 16:45:12.194233983 +0000 UTC m=+0.086860604 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Nov 28 11:45:12 np0005538960 nova_compute[187252]: 2025-11-28 16:45:12.311 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:14 np0005538960 nova_compute[187252]: 2025-11-28 16:45:14.248 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:17 np0005538960 podman[227442]: 2025-11-28 16:45:17.180120041 +0000 UTC m=+0.075167818 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:45:17 np0005538960 nova_compute[187252]: 2025-11-28 16:45:17.312 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:19 np0005538960 nova_compute[187252]: 2025-11-28 16:45:19.259 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:20 np0005538960 podman[227467]: 2025-11-28 16:45:20.170993831 +0000 UTC m=+0.070376881 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, managed_by=edpm_ansible, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350)
Nov 28 11:45:22 np0005538960 nova_compute[187252]: 2025-11-28 16:45:22.314 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:24 np0005538960 nova_compute[187252]: 2025-11-28 16:45:24.260 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:27 np0005538960 nova_compute[187252]: 2025-11-28 16:45:27.316 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:29 np0005538960 nova_compute[187252]: 2025-11-28 16:45:29.749 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:29 np0005538960 podman[227488]: 2025-11-28 16:45:29.83446758 +0000 UTC m=+0.066859586 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:45:32 np0005538960 nova_compute[187252]: 2025-11-28 16:45:32.318 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:34 np0005538960 podman[227508]: 2025-11-28 16:45:34.175959145 +0000 UTC m=+0.074439470 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 11:45:34 np0005538960 nova_compute[187252]: 2025-11-28 16:45:34.752 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:36 np0005538960 nova_compute[187252]: 2025-11-28 16:45:36.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:45:36 np0005538960 nova_compute[187252]: 2025-11-28 16:45:36.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:45:36 np0005538960 nova_compute[187252]: 2025-11-28 16:45:36.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:45:36 np0005538960 nova_compute[187252]: 2025-11-28 16:45:36.333 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:45:37 np0005538960 nova_compute[187252]: 2025-11-28 16:45:37.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:45:37 np0005538960 nova_compute[187252]: 2025-11-28 16:45:37.318 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:38 np0005538960 nova_compute[187252]: 2025-11-28 16:45:38.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:45:38 np0005538960 nova_compute[187252]: 2025-11-28 16:45:38.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:45:38 np0005538960 nova_compute[187252]: 2025-11-28 16:45:38.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:45:39 np0005538960 nova_compute[187252]: 2025-11-28 16:45:39.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:45:39 np0005538960 nova_compute[187252]: 2025-11-28 16:45:39.752 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:41 np0005538960 podman[227532]: 2025-11-28 16:45:41.186922325 +0000 UTC m=+0.096725545 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125)
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.336 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.337 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.337 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.337 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.500 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.502 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5763MB free_disk=73.33752059936523GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.502 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.503 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.557 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.558 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.572 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing inventories for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.586 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating ProviderTree inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.586 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.608 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing aggregate associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.629 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing trait associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, traits: COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.653 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.666 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.667 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:45:41 np0005538960 nova_compute[187252]: 2025-11-28 16:45:41.668 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:45:42 np0005538960 nova_compute[187252]: 2025-11-28 16:45:42.320 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:42 np0005538960 nova_compute[187252]: 2025-11-28 16:45:42.663 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:45:43 np0005538960 podman[227561]: 2025-11-28 16:45:43.171121427 +0000 UTC m=+0.056547403 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 11:45:43 np0005538960 podman[227560]: 2025-11-28 16:45:43.184647918 +0000 UTC m=+0.068463155 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd)
Nov 28 11:45:44 np0005538960 nova_compute[187252]: 2025-11-28 16:45:44.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:45:44 np0005538960 nova_compute[187252]: 2025-11-28 16:45:44.754 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:47 np0005538960 nova_compute[187252]: 2025-11-28 16:45:47.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:45:47 np0005538960 nova_compute[187252]: 2025-11-28 16:45:47.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:45:47 np0005538960 nova_compute[187252]: 2025-11-28 16:45:47.321 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:48 np0005538960 podman[227598]: 2025-11-28 16:45:48.182499608 +0000 UTC m=+0.083613914 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:45:49 np0005538960 nova_compute[187252]: 2025-11-28 16:45:49.756 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:51 np0005538960 podman[227622]: 2025-11-28 16:45:51.178074334 +0000 UTC m=+0.087786727 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=)
Nov 28 11:45:52 np0005538960 nova_compute[187252]: 2025-11-28 16:45:52.323 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:54 np0005538960 nova_compute[187252]: 2025-11-28 16:45:54.759 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:57 np0005538960 nova_compute[187252]: 2025-11-28 16:45:57.359 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:45:59 np0005538960 systemd-logind[788]: New session 40 of user zuul.
Nov 28 11:45:59 np0005538960 systemd[1]: Started Session 40 of User zuul.
Nov 28 11:45:59 np0005538960 nova_compute[187252]: 2025-11-28 16:45:59.760 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:00 np0005538960 podman[227683]: 2025-11-28 16:46:00.346973453 +0000 UTC m=+0.073040607 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, org.label-schema.schema-version=1.0)
Nov 28 11:46:02 np0005538960 nova_compute[187252]: 2025-11-28 16:46:02.362 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:04 np0005538960 nova_compute[187252]: 2025-11-28 16:46:04.763 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:05 np0005538960 podman[227813]: 2025-11-28 16:46:05.179743717 +0000 UTC m=+0.081081383 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:46:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:46:06.372 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:46:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:46:06.373 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:46:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:46:06.373 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:46:07 np0005538960 nova_compute[187252]: 2025-11-28 16:46:07.365 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:09 np0005538960 nova_compute[187252]: 2025-11-28 16:46:09.766 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:12 np0005538960 ovs-vsctl[227871]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 28 11:46:12 np0005538960 podman[227872]: 2025-11-28 16:46:12.193343491 +0000 UTC m=+0.090560434 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:46:12 np0005538960 nova_compute[187252]: 2025-11-28 16:46:12.367 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:12 np0005538960 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 227673 (sos)
Nov 28 11:46:12 np0005538960 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Nov 28 11:46:12 np0005538960 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Nov 28 11:46:13 np0005538960 virtqemud[186797]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 28 11:46:13 np0005538960 virtqemud[186797]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 28 11:46:13 np0005538960 virtqemud[186797]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 28 11:46:13 np0005538960 podman[228114]: 2025-11-28 16:46:13.482668869 +0000 UTC m=+0.078194613 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:46:13 np0005538960 podman[228108]: 2025-11-28 16:46:13.512271352 +0000 UTC m=+0.107259092 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Nov 28 11:46:14 np0005538960 nova_compute[187252]: 2025-11-28 16:46:14.767 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:16 np0005538960 systemd[1]: Starting Hostname Service...
Nov 28 11:46:16 np0005538960 systemd[1]: Started Hostname Service.
Nov 28 11:46:17 np0005538960 nova_compute[187252]: 2025-11-28 16:46:17.369 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:18 np0005538960 podman[228691]: 2025-11-28 16:46:18.705619352 +0000 UTC m=+0.061322880 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:46:19 np0005538960 nova_compute[187252]: 2025-11-28 16:46:19.769 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:22 np0005538960 podman[229097]: 2025-11-28 16:46:22.162739819 +0000 UTC m=+0.063305529 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, release=1755695350)
Nov 28 11:46:22 np0005538960 nova_compute[187252]: 2025-11-28 16:46:22.371 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:23 np0005538960 ovs-appctl[229651]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 28 11:46:23 np0005538960 ovs-appctl[229666]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 28 11:46:23 np0005538960 ovs-appctl[229674]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Nov 28 11:46:24 np0005538960 nova_compute[187252]: 2025-11-28 16:46:24.771 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:26 np0005538960 nova_compute[187252]: 2025-11-28 16:46:26.681 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:27 np0005538960 nova_compute[187252]: 2025-11-28 16:46:27.372 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:29 np0005538960 nova_compute[187252]: 2025-11-28 16:46:29.773 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:30 np0005538960 podman[230793]: 2025-11-28 16:46:30.482464361 +0000 UTC m=+0.063452652 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 11:46:31 np0005538960 virtqemud[186797]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 28 11:46:32 np0005538960 nova_compute[187252]: 2025-11-28 16:46:32.374 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:33 np0005538960 systemd[1]: Starting Time & Date Service...
Nov 28 11:46:33 np0005538960 systemd[1]: Started Time & Date Service.
Nov 28 11:46:34 np0005538960 nova_compute[187252]: 2025-11-28 16:46:34.776 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:46:35.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:46:35 np0005538960 podman[231246]: 2025-11-28 16:46:35.442473266 +0000 UTC m=+0.083680517 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:46:37 np0005538960 nova_compute[187252]: 2025-11-28 16:46:37.325 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:37 np0005538960 nova_compute[187252]: 2025-11-28 16:46:37.325 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:46:37 np0005538960 nova_compute[187252]: 2025-11-28 16:46:37.326 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:46:37 np0005538960 nova_compute[187252]: 2025-11-28 16:46:37.337 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:46:37 np0005538960 nova_compute[187252]: 2025-11-28 16:46:37.376 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:39 np0005538960 nova_compute[187252]: 2025-11-28 16:46:39.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:39 np0005538960 nova_compute[187252]: 2025-11-28 16:46:39.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:39 np0005538960 nova_compute[187252]: 2025-11-28 16:46:39.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:39 np0005538960 nova_compute[187252]: 2025-11-28 16:46:39.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:46:39 np0005538960 nova_compute[187252]: 2025-11-28 16:46:39.780 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:40 np0005538960 nova_compute[187252]: 2025-11-28 16:46:40.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.342 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.343 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.344 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.344 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.494 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.496 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5373MB free_disk=72.84720230102539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.496 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.496 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.548 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.549 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.581 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.597 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.616 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:46:41 np0005538960 nova_compute[187252]: 2025-11-28 16:46:41.616 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.120s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:46:42 np0005538960 nova_compute[187252]: 2025-11-28 16:46:42.379 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:43 np0005538960 podman[231273]: 2025-11-28 16:46:43.110752863 +0000 UTC m=+0.088536985 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:46:43 np0005538960 podman[231300]: 2025-11-28 16:46:43.918714074 +0000 UTC m=+0.053038458 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:46:43 np0005538960 podman[231299]: 2025-11-28 16:46:43.92186532 +0000 UTC m=+0.058028609 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3)
Nov 28 11:46:44 np0005538960 nova_compute[187252]: 2025-11-28 16:46:44.781 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:45 np0005538960 nova_compute[187252]: 2025-11-28 16:46:45.617 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:47 np0005538960 nova_compute[187252]: 2025-11-28 16:46:47.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:47 np0005538960 nova_compute[187252]: 2025-11-28 16:46:47.381 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:48 np0005538960 nova_compute[187252]: 2025-11-28 16:46:48.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:48 np0005538960 nova_compute[187252]: 2025-11-28 16:46:48.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 11:46:48 np0005538960 nova_compute[187252]: 2025-11-28 16:46:48.330 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 11:46:49 np0005538960 podman[231338]: 2025-11-28 16:46:49.15081755 +0000 UTC m=+0.059356111 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:46:49 np0005538960 nova_compute[187252]: 2025-11-28 16:46:49.331 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:46:49 np0005538960 nova_compute[187252]: 2025-11-28 16:46:49.786 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:52 np0005538960 nova_compute[187252]: 2025-11-28 16:46:52.383 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:53 np0005538960 podman[231363]: 2025-11-28 16:46:53.179675253 +0000 UTC m=+0.085836119 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:46:54 np0005538960 nova_compute[187252]: 2025-11-28 16:46:54.789 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:57 np0005538960 systemd[1]: session-40.scope: Deactivated successfully.
Nov 28 11:46:57 np0005538960 systemd[1]: session-40.scope: Consumed 1min 22.001s CPU time, 499.7M memory peak, read 101.2M from disk, written 25.6M to disk.
Nov 28 11:46:57 np0005538960 systemd-logind[788]: Session 40 logged out. Waiting for processes to exit.
Nov 28 11:46:57 np0005538960 systemd-logind[788]: Removed session 40.
Nov 28 11:46:57 np0005538960 systemd-logind[788]: New session 41 of user zuul.
Nov 28 11:46:57 np0005538960 nova_compute[187252]: 2025-11-28 16:46:57.385 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:46:57 np0005538960 systemd[1]: Started Session 41 of User zuul.
Nov 28 11:46:57 np0005538960 systemd[1]: session-41.scope: Deactivated successfully.
Nov 28 11:46:57 np0005538960 systemd-logind[788]: Session 41 logged out. Waiting for processes to exit.
Nov 28 11:46:57 np0005538960 systemd-logind[788]: Removed session 41.
Nov 28 11:46:57 np0005538960 systemd-logind[788]: New session 42 of user zuul.
Nov 28 11:46:57 np0005538960 systemd[1]: Started Session 42 of User zuul.
Nov 28 11:46:57 np0005538960 systemd-logind[788]: Session 42 logged out. Waiting for processes to exit.
Nov 28 11:46:57 np0005538960 systemd[1]: session-42.scope: Deactivated successfully.
Nov 28 11:46:57 np0005538960 systemd-logind[788]: Removed session 42.
Nov 28 11:46:59 np0005538960 nova_compute[187252]: 2025-11-28 16:46:59.790 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:00 np0005538960 nova_compute[187252]: 2025-11-28 16:47:00.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:47:00 np0005538960 nova_compute[187252]: 2025-11-28 16:47:00.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 11:47:01 np0005538960 podman[231443]: 2025-11-28 16:47:01.17681534 +0000 UTC m=+0.080134010 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 11:47:02 np0005538960 nova_compute[187252]: 2025-11-28 16:47:02.420 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:03 np0005538960 systemd[1]: systemd-timedated.service: Deactivated successfully.
Nov 28 11:47:03 np0005538960 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Nov 28 11:47:04 np0005538960 nova_compute[187252]: 2025-11-28 16:47:04.792 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:06 np0005538960 podman[231469]: 2025-11-28 16:47:06.186894339 +0000 UTC m=+0.080117789 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:47:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:47:06.374 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:47:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:47:06.375 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:47:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:47:06.375 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:47:07 np0005538960 nova_compute[187252]: 2025-11-28 16:47:07.422 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:09 np0005538960 nova_compute[187252]: 2025-11-28 16:47:09.794 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:12 np0005538960 nova_compute[187252]: 2025-11-28 16:47:12.424 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:14 np0005538960 podman[231496]: 2025-11-28 16:47:14.182270263 +0000 UTC m=+0.079178757 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:47:14 np0005538960 podman[231495]: 2025-11-28 16:47:14.190407462 +0000 UTC m=+0.090844522 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:47:14 np0005538960 podman[231494]: 2025-11-28 16:47:14.25051335 +0000 UTC m=+0.145964019 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:47:14 np0005538960 nova_compute[187252]: 2025-11-28 16:47:14.795 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:17 np0005538960 nova_compute[187252]: 2025-11-28 16:47:17.425 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:19 np0005538960 nova_compute[187252]: 2025-11-28 16:47:19.797 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:20 np0005538960 podman[231554]: 2025-11-28 16:47:20.196244121 +0000 UTC m=+0.083568493 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:47:22 np0005538960 nova_compute[187252]: 2025-11-28 16:47:22.426 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:24 np0005538960 podman[231579]: 2025-11-28 16:47:24.218323039 +0000 UTC m=+0.110449910 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1755695350)
Nov 28 11:47:24 np0005538960 nova_compute[187252]: 2025-11-28 16:47:24.799 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:27 np0005538960 nova_compute[187252]: 2025-11-28 16:47:27.428 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:29 np0005538960 nova_compute[187252]: 2025-11-28 16:47:29.801 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:32 np0005538960 podman[231600]: 2025-11-28 16:47:32.158677248 +0000 UTC m=+0.062400597 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=edpm)
Nov 28 11:47:32 np0005538960 nova_compute[187252]: 2025-11-28 16:47:32.431 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:34 np0005538960 nova_compute[187252]: 2025-11-28 16:47:34.804 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:37 np0005538960 podman[231620]: 2025-11-28 16:47:37.195268544 +0000 UTC m=+0.089024627 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 11:47:37 np0005538960 nova_compute[187252]: 2025-11-28 16:47:37.433 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:39 np0005538960 nova_compute[187252]: 2025-11-28 16:47:39.332 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:47:39 np0005538960 nova_compute[187252]: 2025-11-28 16:47:39.333 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:47:39 np0005538960 nova_compute[187252]: 2025-11-28 16:47:39.333 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:47:39 np0005538960 nova_compute[187252]: 2025-11-28 16:47:39.351 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:47:39 np0005538960 nova_compute[187252]: 2025-11-28 16:47:39.351 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:47:39 np0005538960 nova_compute[187252]: 2025-11-28 16:47:39.351 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:47:39 np0005538960 nova_compute[187252]: 2025-11-28 16:47:39.806 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:40 np0005538960 nova_compute[187252]: 2025-11-28 16:47:40.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.354 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.355 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.356 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.356 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.527 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.528 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5709MB free_disk=73.33726119995117GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.528 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.528 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.609 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.609 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.804 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.836 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.865 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:47:41 np0005538960 nova_compute[187252]: 2025-11-28 16:47:41.866 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:47:42 np0005538960 nova_compute[187252]: 2025-11-28 16:47:42.436 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:43 np0005538960 nova_compute[187252]: 2025-11-28 16:47:43.863 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:47:44 np0005538960 nova_compute[187252]: 2025-11-28 16:47:44.808 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:47:45 np0005538960 podman[231645]: 2025-11-28 16:47:45.162842318 +0000 UTC m=+0.066878194 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:47:45 np0005538960 podman[231646]: 2025-11-28 16:47:45.163134596 +0000 UTC m=+0.064373654 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 11:47:45 np0005538960 podman[231644]: 2025-11-28 16:47:45.215135597 +0000 UTC m=+0.123250563 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Nov 28 11:47:47 np0005538960 nova_compute[187252]: 2025-11-28 16:47:47.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:47:47 np0005538960 nova_compute[187252]: 2025-11-28 16:47:47.438 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:47:49 np0005538960 nova_compute[187252]: 2025-11-28 16:47:49.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:47:49 np0005538960 nova_compute[187252]: 2025-11-28 16:47:49.810 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:47:50 np0005538960 nova_compute[187252]: 2025-11-28 16:47:50.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Nov 28 11:47:51 np0005538960 podman[231707]: 2025-11-28 16:47:51.140822336 +0000 UTC m=+0.050260699 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:47:52 np0005538960 nova_compute[187252]: 2025-11-28 16:47:52.440 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:47:54 np0005538960 nova_compute[187252]: 2025-11-28 16:47:54.811 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:47:55 np0005538960 podman[231733]: 2025-11-28 16:47:55.141313995 +0000 UTC m=+0.050021344 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the 
latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, version=9.6, managed_by=edpm_ansible, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc.)
Nov 28 11:47:57 np0005538960 nova_compute[187252]: 2025-11-28 16:47:57.442 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:47:59 np0005538960 nova_compute[187252]: 2025-11-28 16:47:59.813 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:02 np0005538960 nova_compute[187252]: 2025-11-28 16:48:02.444 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:03 np0005538960 podman[231753]: 2025-11-28 16:48:03.159400763 +0000 UTC m=+0.065271236 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Nov 28 11:48:04 np0005538960 nova_compute[187252]: 2025-11-28 16:48:04.814 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:48:06.376 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Nov 28 11:48:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:48:06.377 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Nov 28 11:48:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:48:06.377 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Nov 28 11:48:07 np0005538960 nova_compute[187252]: 2025-11-28 16:48:07.445 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:08 np0005538960 podman[231776]: 2025-11-28 16:48:08.136360992 +0000 UTC m=+0.043591806 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:48:09 np0005538960 nova_compute[187252]: 2025-11-28 16:48:09.815 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:12 np0005538960 nova_compute[187252]: 2025-11-28 16:48:12.447 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:14 np0005538960 nova_compute[187252]: 2025-11-28 16:48:14.819 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:16 np0005538960 podman[231802]: 2025-11-28 16:48:16.175407565 +0000 UTC m=+0.076277305 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Nov 28 11:48:16 np0005538960 podman[231801]: 2025-11-28 16:48:16.181961105 +0000 UTC m=+0.085383457 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=multipathd, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Nov 28 11:48:16 np0005538960 podman[231800]: 2025-11-28 16:48:16.197789602 +0000 UTC m=+0.101197774 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Nov 28 11:48:17 np0005538960 nova_compute[187252]: 2025-11-28 16:48:17.448 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:19 np0005538960 nova_compute[187252]: 2025-11-28 16:48:19.822 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:22 np0005538960 podman[231864]: 2025-11-28 16:48:22.138851128 +0000 UTC m=+0.048719002 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:48:22 np0005538960 nova_compute[187252]: 2025-11-28 16:48:22.449 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:24 np0005538960 nova_compute[187252]: 2025-11-28 16:48:24.823 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Nov 28 11:48:26 np0005538960 podman[231888]: 2025-11-28 16:48:26.140157877 +0000 UTC m=+0.046919997 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, distribution-scope=public)
Nov 28 11:48:27 np0005538960 nova_compute[187252]: 2025-11-28 16:48:27.450 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:29 np0005538960 nova_compute[187252]: 2025-11-28 16:48:29.824 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:32 np0005538960 nova_compute[187252]: 2025-11-28 16:48:32.452 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:34 np0005538960 podman[231909]: 2025-11-28 16:48:34.140742346 +0000 UTC m=+0.051658064 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 11:48:34 np0005538960 nova_compute[187252]: 2025-11-28 16:48:34.826 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.321 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:48:35.322 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:48:37 np0005538960 nova_compute[187252]: 2025-11-28 16:48:37.455 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:39 np0005538960 podman[231929]: 2025-11-28 16:48:39.171319496 +0000 UTC m=+0.071303694 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:48:39 np0005538960 nova_compute[187252]: 2025-11-28 16:48:39.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:48:39 np0005538960 nova_compute[187252]: 2025-11-28 16:48:39.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:48:39 np0005538960 nova_compute[187252]: 2025-11-28 16:48:39.830 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:40 np0005538960 nova_compute[187252]: 2025-11-28 16:48:40.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.314 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.331 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.331 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.358 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.359 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.359 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.359 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.514 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.515 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5717MB free_disk=73.3373794555664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.515 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.515 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.688 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.688 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.777 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.791 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.792 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:48:41 np0005538960 nova_compute[187252]: 2025-11-28 16:48:41.792 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:48:42 np0005538960 nova_compute[187252]: 2025-11-28 16:48:42.457 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:42 np0005538960 nova_compute[187252]: 2025-11-28 16:48:42.789 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:48:43 np0005538960 nova_compute[187252]: 2025-11-28 16:48:43.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:48:44 np0005538960 nova_compute[187252]: 2025-11-28 16:48:44.831 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:47 np0005538960 podman[231958]: 2025-11-28 16:48:47.176694435 +0000 UTC m=+0.071264963 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:48:47 np0005538960 podman[231957]: 2025-11-28 16:48:47.183788429 +0000 UTC m=+0.088153556 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:48:47 np0005538960 podman[231956]: 2025-11-28 16:48:47.221793777 +0000 UTC m=+0.131524396 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251125, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Nov 28 11:48:47 np0005538960 nova_compute[187252]: 2025-11-28 16:48:47.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:48:47 np0005538960 nova_compute[187252]: 2025-11-28 16:48:47.459 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:49 np0005538960 nova_compute[187252]: 2025-11-28 16:48:49.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:48:49 np0005538960 nova_compute[187252]: 2025-11-28 16:48:49.833 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:50 np0005538960 nova_compute[187252]: 2025-11-28 16:48:50.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:48:52 np0005538960 nova_compute[187252]: 2025-11-28 16:48:52.461 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:53 np0005538960 podman[232021]: 2025-11-28 16:48:53.151320752 +0000 UTC m=+0.052671279 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:48:54 np0005538960 nova_compute[187252]: 2025-11-28 16:48:54.874 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:57 np0005538960 podman[232045]: 2025-11-28 16:48:57.147894226 +0000 UTC m=+0.057975418 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, name=ubi9-minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, 
build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:48:57 np0005538960 nova_compute[187252]: 2025-11-28 16:48:57.463 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:48:59 np0005538960 nova_compute[187252]: 2025-11-28 16:48:59.875 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:02 np0005538960 nova_compute[187252]: 2025-11-28 16:49:02.466 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:04 np0005538960 nova_compute[187252]: 2025-11-28 16:49:04.876 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:05 np0005538960 podman[232067]: 2025-11-28 16:49:05.167121932 +0000 UTC m=+0.069148022 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:49:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:49:06.377 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:49:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:49:06.377 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:49:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:49:06.377 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:49:07 np0005538960 nova_compute[187252]: 2025-11-28 16:49:07.468 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:09 np0005538960 nova_compute[187252]: 2025-11-28 16:49:09.878 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:10 np0005538960 podman[232087]: 2025-11-28 16:49:10.200345665 +0000 UTC m=+0.084289251 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:49:12 np0005538960 nova_compute[187252]: 2025-11-28 16:49:12.470 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:14 np0005538960 nova_compute[187252]: 2025-11-28 16:49:14.881 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:17 np0005538960 nova_compute[187252]: 2025-11-28 16:49:17.472 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:18 np0005538960 podman[232113]: 2025-11-28 16:49:18.156731816 +0000 UTC m=+0.057003024 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:49:18 np0005538960 podman[232111]: 2025-11-28 16:49:18.172801338 +0000 UTC m=+0.081588764 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125)
Nov 28 11:49:18 np0005538960 podman[232112]: 2025-11-28 16:49:18.185251953 +0000 UTC m=+0.088765840 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 11:49:19 np0005538960 nova_compute[187252]: 2025-11-28 16:49:19.883 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:22 np0005538960 nova_compute[187252]: 2025-11-28 16:49:22.474 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:24 np0005538960 podman[232168]: 2025-11-28 16:49:24.137712399 +0000 UTC m=+0.045295399 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:49:24 np0005538960 nova_compute[187252]: 2025-11-28 16:49:24.885 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:27 np0005538960 nova_compute[187252]: 2025-11-28 16:49:27.477 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:28 np0005538960 podman[232192]: 2025-11-28 16:49:28.156741132 +0000 UTC m=+0.063409740 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Nov 28 11:49:29 np0005538960 nova_compute[187252]: 2025-11-28 16:49:29.886 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:32 np0005538960 nova_compute[187252]: 2025-11-28 16:49:32.479 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:34 np0005538960 nova_compute[187252]: 2025-11-28 16:49:34.889 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:35 np0005538960 podman[232213]: 2025-11-28 16:49:35.425939055 +0000 UTC m=+0.078626853 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Nov 28 11:49:37 np0005538960 nova_compute[187252]: 2025-11-28 16:49:37.481 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:39 np0005538960 nova_compute[187252]: 2025-11-28 16:49:39.890 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:40 np0005538960 nova_compute[187252]: 2025-11-28 16:49:40.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:49:41 np0005538960 podman[232233]: 2025-11-28 16:49:41.165470934 +0000 UTC m=+0.066900136 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.344 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.344 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.345 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.345 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.505 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.506 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5716MB free_disk=73.3373794555664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.507 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.507 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.583 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.584 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.628 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.640 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.642 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:49:41 np0005538960 nova_compute[187252]: 2025-11-28 16:49:41.642 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:49:42 np0005538960 nova_compute[187252]: 2025-11-28 16:49:42.484 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:43 np0005538960 nova_compute[187252]: 2025-11-28 16:49:43.642 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:49:43 np0005538960 nova_compute[187252]: 2025-11-28 16:49:43.642 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:49:43 np0005538960 nova_compute[187252]: 2025-11-28 16:49:43.642 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:49:43 np0005538960 nova_compute[187252]: 2025-11-28 16:49:43.659 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:49:43 np0005538960 nova_compute[187252]: 2025-11-28 16:49:43.659 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:49:44 np0005538960 nova_compute[187252]: 2025-11-28 16:49:44.328 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:49:44 np0005538960 nova_compute[187252]: 2025-11-28 16:49:44.892 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:47 np0005538960 nova_compute[187252]: 2025-11-28 16:49:47.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:49:47 np0005538960 nova_compute[187252]: 2025-11-28 16:49:47.486 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:48 np0005538960 nova_compute[187252]: 2025-11-28 16:49:48.310 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:49:49 np0005538960 podman[232261]: 2025-11-28 16:49:49.189253634 +0000 UTC m=+0.084305861 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible)
Nov 28 11:49:49 np0005538960 podman[232260]: 2025-11-28 16:49:49.191636923 +0000 UTC m=+0.087636893 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, tcib_managed=true)
Nov 28 11:49:49 np0005538960 podman[232259]: 2025-11-28 16:49:49.216787166 +0000 UTC m=+0.120859214 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:49:49 np0005538960 nova_compute[187252]: 2025-11-28 16:49:49.895 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:50 np0005538960 nova_compute[187252]: 2025-11-28 16:49:50.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:49:52 np0005538960 nova_compute[187252]: 2025-11-28 16:49:52.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:49:52 np0005538960 nova_compute[187252]: 2025-11-28 16:49:52.488 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:54 np0005538960 nova_compute[187252]: 2025-11-28 16:49:54.895 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:55 np0005538960 podman[232320]: 2025-11-28 16:49:55.165807555 +0000 UTC m=+0.074989984 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Nov 28 11:49:57 np0005538960 nova_compute[187252]: 2025-11-28 16:49:57.490 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:49:59 np0005538960 podman[232345]: 2025-11-28 16:49:59.153833183 +0000 UTC m=+0.058603383 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.buildah.version=1.33.7, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Nov 28 11:49:59 np0005538960 nova_compute[187252]: 2025-11-28 16:49:59.899 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:02 np0005538960 nova_compute[187252]: 2025-11-28 16:50:02.492 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:04 np0005538960 nova_compute[187252]: 2025-11-28 16:50:04.900 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:06 np0005538960 podman[232366]: 2025-11-28 16:50:06.144737561 +0000 UTC m=+0.054508374 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Nov 28 11:50:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:50:06.378 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:50:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:50:06.378 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:50:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:50:06.378 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:50:07 np0005538960 nova_compute[187252]: 2025-11-28 16:50:07.494 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:09 np0005538960 nova_compute[187252]: 2025-11-28 16:50:09.902 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:12 np0005538960 podman[232389]: 2025-11-28 16:50:12.134763274 +0000 UTC m=+0.046677632 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Nov 28 11:50:12 np0005538960 nova_compute[187252]: 2025-11-28 16:50:12.517 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:14 np0005538960 nova_compute[187252]: 2025-11-28 16:50:14.903 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:17 np0005538960 nova_compute[187252]: 2025-11-28 16:50:17.520 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:19 np0005538960 nova_compute[187252]: 2025-11-28 16:50:19.905 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:20 np0005538960 podman[232417]: 2025-11-28 16:50:20.149458371 +0000 UTC m=+0.043996705 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 11:50:20 np0005538960 podman[232416]: 2025-11-28 16:50:20.160863 +0000 UTC m=+0.055453255 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251125, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Nov 28 11:50:20 np0005538960 podman[232415]: 2025-11-28 16:50:20.178622744 +0000 UTC m=+0.078492699 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 11:50:21 np0005538960 nova_compute[187252]: 2025-11-28 16:50:21.690 187256 DEBUG oslo_concurrency.processutils [None req-0b3f908c-dddd-48c7-ae08-fa8cc219ea75 be9632161e8d47dea81ff52a3302fd52 0d115147376746f886db4c9ce486a477 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Nov 28 11:50:21 np0005538960 nova_compute[187252]: 2025-11-28 16:50:21.708 187256 DEBUG oslo_concurrency.processutils [None req-0b3f908c-dddd-48c7-ae08-fa8cc219ea75 be9632161e8d47dea81ff52a3302fd52 0d115147376746f886db4c9ce486a477 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Nov 28 11:50:22 np0005538960 nova_compute[187252]: 2025-11-28 16:50:22.522 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:24 np0005538960 nova_compute[187252]: 2025-11-28 16:50:24.939 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:26 np0005538960 podman[232479]: 2025-11-28 16:50:26.449797458 +0000 UTC m=+0.360391029 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Nov 28 11:50:27 np0005538960 nova_compute[187252]: 2025-11-28 16:50:27.524 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:27 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:50:27.591 104369 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '2e:a5:ca', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'b2:e2:58:ba:57:87'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Nov 28 11:50:27 np0005538960 nova_compute[187252]: 2025-11-28 16:50:27.591 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:27 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:50:27.592 104369 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Nov 28 11:50:29 np0005538960 nova_compute[187252]: 2025-11-28 16:50:29.941 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:30 np0005538960 podman[232503]: 2025-11-28 16:50:30.142019862 +0000 UTC m=+0.047962613 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, version=9.6, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Nov 28 11:50:32 np0005538960 nova_compute[187252]: 2025-11-28 16:50:32.527 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:34 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:50:34.594 104369 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=ac0d1e81-02b2-487b-bc65-46ccb331e9e4, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Nov 28 11:50:34 np0005538960 nova_compute[187252]: 2025-11-28 16:50:34.976 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:50:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:50:37 np0005538960 podman[232526]: 2025-11-28 16:50:37.140755853 +0000 UTC m=+0.050015253 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Nov 28 11:50:37 np0005538960 nova_compute[187252]: 2025-11-28 16:50:37.528 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:39 np0005538960 nova_compute[187252]: 2025-11-28 16:50:39.977 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:41 np0005538960 nova_compute[187252]: 2025-11-28 16:50:41.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:50:41 np0005538960 nova_compute[187252]: 2025-11-28 16:50:41.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:50:41 np0005538960 nova_compute[187252]: 2025-11-28 16:50:41.316 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:50:42 np0005538960 nova_compute[187252]: 2025-11-28 16:50:42.530 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:43 np0005538960 podman[232546]: 2025-11-28 16:50:43.154806993 +0000 UTC m=+0.051839489 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.347 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.348 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.348 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.348 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.489 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.490 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5714MB free_disk=73.3373794555664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.491 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.491 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.561 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.562 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.581 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing inventories for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.617 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating ProviderTree inventory for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.618 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Updating inventory in ProviderTree for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.657 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing aggregate associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.682 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Refreshing trait associations for resource provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce, traits: COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NODE,COMPUTE_ACCELERATORS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SATA,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSSE3,HW_CPU_X86_SSE42,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SECURITY_TPM_2_0,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_SSE41,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.698 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.719 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.721 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:50:43 np0005538960 nova_compute[187252]: 2025-11-28 16:50:43.721 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.230s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:50:44 np0005538960 nova_compute[187252]: 2025-11-28 16:50:44.978 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:45 np0005538960 nova_compute[187252]: 2025-11-28 16:50:45.722 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:50:45 np0005538960 nova_compute[187252]: 2025-11-28 16:50:45.722 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:50:45 np0005538960 nova_compute[187252]: 2025-11-28 16:50:45.723 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:50:45 np0005538960 nova_compute[187252]: 2025-11-28 16:50:45.733 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:50:46 np0005538960 nova_compute[187252]: 2025-11-28 16:50:46.321 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:50:47 np0005538960 nova_compute[187252]: 2025-11-28 16:50:47.532 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:49 np0005538960 nova_compute[187252]: 2025-11-28 16:50:49.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:50:49 np0005538960 nova_compute[187252]: 2025-11-28 16:50:49.980 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:51 np0005538960 podman[232572]: 2025-11-28 16:50:51.205013815 +0000 UTC m=+0.093805184 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:50:51 np0005538960 podman[232571]: 2025-11-28 16:50:51.204735448 +0000 UTC m=+0.102961027 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller)
Nov 28 11:50:51 np0005538960 podman[232578]: 2025-11-28 16:50:51.210819296 +0000 UTC m=+0.086168836 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Nov 28 11:50:51 np0005538960 nova_compute[187252]: 2025-11-28 16:50:51.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:50:52 np0005538960 nova_compute[187252]: 2025-11-28 16:50:52.580 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:53 np0005538960 nova_compute[187252]: 2025-11-28 16:50:53.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:50:54 np0005538960 nova_compute[187252]: 2025-11-28 16:50:54.981 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:57 np0005538960 podman[232632]: 2025-11-28 16:50:57.168763736 +0000 UTC m=+0.071895739 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:50:57 np0005538960 nova_compute[187252]: 2025-11-28 16:50:57.582 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:50:59 np0005538960 nova_compute[187252]: 2025-11-28 16:50:59.983 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:01 np0005538960 podman[232657]: 2025-11-28 16:51:01.146460729 +0000 UTC m=+0.056743028 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Nov 28 11:51:02 np0005538960 nova_compute[187252]: 2025-11-28 16:51:02.586 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:04 np0005538960 nova_compute[187252]: 2025-11-28 16:51:04.984 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:51:06.379 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:51:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:51:06.380 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:51:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:51:06.380 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.315 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.316 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.316 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.316 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.316 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.317 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.489 187256 DEBUG nova.virt.libvirt.imagecache [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.490 187256 WARNING nova.virt.libvirt.imagecache [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.490 187256 INFO nova.virt.libvirt.imagecache [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Removable base files: /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.490 187256 INFO nova.virt.libvirt.imagecache [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/a5028f54b566615edf539c536ce9ee5ddf1d51dc#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.490 187256 DEBUG nova.virt.libvirt.imagecache [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.491 187256 DEBUG nova.virt.libvirt.imagecache [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.491 187256 DEBUG nova.virt.libvirt.imagecache [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Nov 28 11:51:07 np0005538960 nova_compute[187252]: 2025-11-28 16:51:07.587 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:08 np0005538960 podman[232678]: 2025-11-28 16:51:08.151200487 +0000 UTC m=+0.063876643 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Nov 28 11:51:09 np0005538960 nova_compute[187252]: 2025-11-28 16:51:09.985 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:12 np0005538960 nova_compute[187252]: 2025-11-28 16:51:12.590 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:14 np0005538960 podman[232699]: 2025-11-28 16:51:14.138731109 +0000 UTC m=+0.048950657 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:51:14 np0005538960 nova_compute[187252]: 2025-11-28 16:51:14.987 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:17 np0005538960 nova_compute[187252]: 2025-11-28 16:51:17.592 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:19 np0005538960 nova_compute[187252]: 2025-11-28 16:51:19.990 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:22 np0005538960 podman[232725]: 2025-11-28 16:51:22.158744554 +0000 UTC m=+0.053385066 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:51:22 np0005538960 podman[232723]: 2025-11-28 16:51:22.191426482 +0000 UTC m=+0.094848438 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Nov 28 11:51:22 np0005538960 podman[232724]: 2025-11-28 16:51:22.191484214 +0000 UTC m=+0.089530759 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251125, container_name=multipathd, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Nov 28 11:51:22 np0005538960 nova_compute[187252]: 2025-11-28 16:51:22.594 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:24 np0005538960 nova_compute[187252]: 2025-11-28 16:51:24.990 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:27 np0005538960 nova_compute[187252]: 2025-11-28 16:51:27.597 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:28 np0005538960 podman[232783]: 2025-11-28 16:51:28.146940683 +0000 UTC m=+0.056747438 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:51:28 np0005538960 nova_compute[187252]: 2025-11-28 16:51:28.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:29 np0005538960 nova_compute[187252]: 2025-11-28 16:51:29.992 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:32 np0005538960 podman[232809]: 2025-11-28 16:51:32.143961929 +0000 UTC m=+0.054872573 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9-minimal)
Nov 28 11:51:32 np0005538960 nova_compute[187252]: 2025-11-28 16:51:32.637 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:34 np0005538960 nova_compute[187252]: 2025-11-28 16:51:34.993 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:37 np0005538960 nova_compute[187252]: 2025-11-28 16:51:37.639 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:39 np0005538960 podman[232832]: 2025-11-28 16:51:39.20815656 +0000 UTC m=+0.103057771 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Nov 28 11:51:39 np0005538960 nova_compute[187252]: 2025-11-28 16:51:39.995 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:42 np0005538960 nova_compute[187252]: 2025-11-28 16:51:42.325 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:42 np0005538960 nova_compute[187252]: 2025-11-28 16:51:42.325 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:42 np0005538960 nova_compute[187252]: 2025-11-28 16:51:42.326 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:51:42 np0005538960 nova_compute[187252]: 2025-11-28 16:51:42.708 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:43 np0005538960 nova_compute[187252]: 2025-11-28 16:51:43.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:44 np0005538960 nova_compute[187252]: 2025-11-28 16:51:44.997 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:45 np0005538960 podman[232854]: 2025-11-28 16:51:45.14282189 +0000 UTC m=+0.052559245 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.359 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.360 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.360 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.360 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.525 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.526 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5723MB free_disk=73.33737564086914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.526 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.526 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.591 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.592 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.621 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.636 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.638 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:51:45 np0005538960 nova_compute[187252]: 2025-11-28 16:51:45.638 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:51:46 np0005538960 nova_compute[187252]: 2025-11-28 16:51:46.640 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:46 np0005538960 nova_compute[187252]: 2025-11-28 16:51:46.640 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:51:46 np0005538960 nova_compute[187252]: 2025-11-28 16:51:46.641 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:51:46 np0005538960 nova_compute[187252]: 2025-11-28 16:51:46.652 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:51:47 np0005538960 nova_compute[187252]: 2025-11-28 16:51:47.323 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:47 np0005538960 nova_compute[187252]: 2025-11-28 16:51:47.711 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:49 np0005538960 nova_compute[187252]: 2025-11-28 16:51:49.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:50 np0005538960 nova_compute[187252]: 2025-11-28 16:51:49.999 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:52 np0005538960 nova_compute[187252]: 2025-11-28 16:51:52.715 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:53 np0005538960 podman[232880]: 2025-11-28 16:51:53.170302878 +0000 UTC m=+0.068913485 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Nov 28 11:51:53 np0005538960 podman[232879]: 2025-11-28 16:51:53.17611159 +0000 UTC m=+0.081359260 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Nov 28 11:51:53 np0005538960 podman[232881]: 2025-11-28 16:51:53.191692601 +0000 UTC m=+0.087094010 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251125, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Nov 28 11:51:53 np0005538960 nova_compute[187252]: 2025-11-28 16:51:53.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:53 np0005538960 nova_compute[187252]: 2025-11-28 16:51:53.327 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:54 np0005538960 nova_compute[187252]: 2025-11-28 16:51:54.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:51:55 np0005538960 nova_compute[187252]: 2025-11-28 16:51:55.000 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:57 np0005538960 nova_compute[187252]: 2025-11-28 16:51:57.719 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:51:59 np0005538960 podman[232946]: 2025-11-28 16:51:59.149102327 +0000 UTC m=+0.054372781 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:52:00 np0005538960 nova_compute[187252]: 2025-11-28 16:52:00.000 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:00 np0005538960 nova_compute[187252]: 2025-11-28 16:52:00.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:00 np0005538960 nova_compute[187252]: 2025-11-28 16:52:00.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Nov 28 11:52:00 np0005538960 nova_compute[187252]: 2025-11-28 16:52:00.328 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Nov 28 11:52:01 np0005538960 nova_compute[187252]: 2025-11-28 16:52:01.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:01 np0005538960 nova_compute[187252]: 2025-11-28 16:52:01.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Nov 28 11:52:02 np0005538960 nova_compute[187252]: 2025-11-28 16:52:02.721 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:03 np0005538960 podman[232971]: 2025-11-28 16:52:03.139987802 +0000 UTC m=+0.052872413 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-type=git, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, version=9.6, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Nov 28 11:52:05 np0005538960 nova_compute[187252]: 2025-11-28 16:52:05.001 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:52:06.380 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:52:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:52:06.380 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:52:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:52:06.381 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:52:07 np0005538960 nova_compute[187252]: 2025-11-28 16:52:07.724 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:10 np0005538960 nova_compute[187252]: 2025-11-28 16:52:10.002 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:10 np0005538960 podman[232992]: 2025-11-28 16:52:10.145355775 +0000 UTC m=+0.054821012 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:52:12 np0005538960 nova_compute[187252]: 2025-11-28 16:52:12.725 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:15 np0005538960 nova_compute[187252]: 2025-11-28 16:52:15.004 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:16 np0005538960 podman[233012]: 2025-11-28 16:52:16.164612203 +0000 UTC m=+0.065139563 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:52:17 np0005538960 nova_compute[187252]: 2025-11-28 16:52:17.727 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:20 np0005538960 nova_compute[187252]: 2025-11-28 16:52:20.005 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:22 np0005538960 nova_compute[187252]: 2025-11-28 16:52:22.730 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:24 np0005538960 podman[233038]: 2025-11-28 16:52:24.147346716 +0000 UTC m=+0.049835039 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Nov 28 11:52:24 np0005538960 podman[233037]: 2025-11-28 16:52:24.162653111 +0000 UTC m=+0.064538568 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Nov 28 11:52:24 np0005538960 podman[233036]: 2025-11-28 16:52:24.196954578 +0000 UTC m=+0.105962019 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:52:25 np0005538960 nova_compute[187252]: 2025-11-28 16:52:25.007 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:27 np0005538960 nova_compute[187252]: 2025-11-28 16:52:27.732 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:30 np0005538960 nova_compute[187252]: 2025-11-28 16:52:30.007 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:30 np0005538960 podman[233098]: 2025-11-28 16:52:30.132848338 +0000 UTC m=+0.043301039 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:52:32 np0005538960 nova_compute[187252]: 2025-11-28 16:52:32.734 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:34 np0005538960 podman[233122]: 2025-11-28 16:52:34.149245387 +0000 UTC m=+0.057568548 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, name=ubi9-minimal)
Nov 28 11:52:35 np0005538960 nova_compute[187252]: 2025-11-28 16:52:35.007 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:52:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:52:37 np0005538960 nova_compute[187252]: 2025-11-28 16:52:37.737 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:39 np0005538960 nova_compute[187252]: 2025-11-28 16:52:39.418 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:40 np0005538960 nova_compute[187252]: 2025-11-28 16:52:40.008 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:41 np0005538960 podman[233143]: 2025-11-28 16:52:41.144735528 +0000 UTC m=+0.054802071 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, config_id=edpm, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Nov 28 11:52:42 np0005538960 nova_compute[187252]: 2025-11-28 16:52:42.738 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:43 np0005538960 nova_compute[187252]: 2025-11-28 16:52:43.332 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:43 np0005538960 nova_compute[187252]: 2025-11-28 16:52:43.332 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:44 np0005538960 nova_compute[187252]: 2025-11-28 16:52:44.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:44 np0005538960 nova_compute[187252]: 2025-11-28 16:52:44.314 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.033 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.339 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.340 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.470 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.471 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5735MB free_disk=73.33737564086914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.471 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.472 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.544 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.544 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.585 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.604 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.605 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:52:45 np0005538960 nova_compute[187252]: 2025-11-28 16:52:45.605 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:52:46 np0005538960 nova_compute[187252]: 2025-11-28 16:52:46.606 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:46 np0005538960 nova_compute[187252]: 2025-11-28 16:52:46.607 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:52:46 np0005538960 nova_compute[187252]: 2025-11-28 16:52:46.607 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:52:46 np0005538960 nova_compute[187252]: 2025-11-28 16:52:46.620 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:52:47 np0005538960 podman[233163]: 2025-11-28 16:52:47.17563458 +0000 UTC m=+0.077173388 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:52:47 np0005538960 nova_compute[187252]: 2025-11-28 16:52:47.741 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:48 np0005538960 nova_compute[187252]: 2025-11-28 16:52:48.324 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:49 np0005538960 nova_compute[187252]: 2025-11-28 16:52:49.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:50 np0005538960 nova_compute[187252]: 2025-11-28 16:52:50.035 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:52 np0005538960 nova_compute[187252]: 2025-11-28 16:52:52.743 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:55 np0005538960 nova_compute[187252]: 2025-11-28 16:52:55.037 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:52:55 np0005538960 podman[233189]: 2025-11-28 16:52:55.153588937 +0000 UTC m=+0.050055344 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125)
Nov 28 11:52:55 np0005538960 podman[233188]: 2025-11-28 16:52:55.154125121 +0000 UTC m=+0.057428054 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:52:55 np0005538960 podman[233187]: 2025-11-28 16:52:55.17621636 +0000 UTC m=+0.085368607 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251125, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Nov 28 11:52:55 np0005538960 nova_compute[187252]: 2025-11-28 16:52:55.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:55 np0005538960 nova_compute[187252]: 2025-11-28 16:52:55.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:52:57 np0005538960 nova_compute[187252]: 2025-11-28 16:52:57.745 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:00 np0005538960 nova_compute[187252]: 2025-11-28 16:53:00.039 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:01 np0005538960 podman[233252]: 2025-11-28 16:53:01.136742622 +0000 UTC m=+0.047673316 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:53:02 np0005538960 nova_compute[187252]: 2025-11-28 16:53:02.747 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:05 np0005538960 nova_compute[187252]: 2025-11-28 16:53:05.042 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:05 np0005538960 podman[233277]: 2025-11-28 16:53:05.142227652 +0000 UTC m=+0.053433486 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible)
Nov 28 11:53:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:53:06.381 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:53:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:53:06.381 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:53:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:53:06.381 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:53:07 np0005538960 nova_compute[187252]: 2025-11-28 16:53:07.785 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:10 np0005538960 nova_compute[187252]: 2025-11-28 16:53:10.044 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:12 np0005538960 podman[233298]: 2025-11-28 16:53:12.138718811 +0000 UTC m=+0.047948840 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, config_id=edpm)
Nov 28 11:53:12 np0005538960 nova_compute[187252]: 2025-11-28 16:53:12.788 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:15 np0005538960 nova_compute[187252]: 2025-11-28 16:53:15.046 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:17 np0005538960 nova_compute[187252]: 2025-11-28 16:53:17.846 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:18 np0005538960 podman[233318]: 2025-11-28 16:53:18.137741667 +0000 UTC m=+0.047195312 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Nov 28 11:53:20 np0005538960 nova_compute[187252]: 2025-11-28 16:53:20.048 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:22 np0005538960 nova_compute[187252]: 2025-11-28 16:53:22.848 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:25 np0005538960 nova_compute[187252]: 2025-11-28 16:53:25.050 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:26 np0005538960 podman[233346]: 2025-11-28 16:53:26.147862324 +0000 UTC m=+0.053256653 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3)
Nov 28 11:53:26 np0005538960 podman[233347]: 2025-11-28 16:53:26.14971836 +0000 UTC m=+0.051192612 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS)
Nov 28 11:53:26 np0005538960 podman[233345]: 2025-11-28 16:53:26.16828647 +0000 UTC m=+0.076714325 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Nov 28 11:53:27 np0005538960 nova_compute[187252]: 2025-11-28 16:53:27.851 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:30 np0005538960 nova_compute[187252]: 2025-11-28 16:53:30.050 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:32 np0005538960 podman[233405]: 2025-11-28 16:53:32.137817322 +0000 UTC m=+0.049865749 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Nov 28 11:53:32 np0005538960 nova_compute[187252]: 2025-11-28 16:53:32.854 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:35 np0005538960 nova_compute[187252]: 2025-11-28 16:53:35.053 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:36 np0005538960 podman[233431]: 2025-11-28 16:53:36.148228653 +0000 UTC m=+0.057334234 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Nov 28 11:53:37 np0005538960 nova_compute[187252]: 2025-11-28 16:53:37.857 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:40 np0005538960 nova_compute[187252]: 2025-11-28 16:53:40.055 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:42 np0005538960 nova_compute[187252]: 2025-11-28 16:53:42.859 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:43 np0005538960 podman[233452]: 2025-11-28 16:53:43.14781406 +0000 UTC m=+0.057880457 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251125, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:53:43 np0005538960 nova_compute[187252]: 2025-11-28 16:53:43.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:53:43 np0005538960 nova_compute[187252]: 2025-11-28 16:53:43.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.055 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.315 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.316 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.337 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.338 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.489 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.490 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5731MB free_disk=73.33737564086914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.490 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.490 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.616 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.617 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.705 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.718 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.719 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:53:45 np0005538960 nova_compute[187252]: 2025-11-28 16:53:45.719 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:53:47 np0005538960 nova_compute[187252]: 2025-11-28 16:53:47.720 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:53:47 np0005538960 nova_compute[187252]: 2025-11-28 16:53:47.721 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:53:47 np0005538960 nova_compute[187252]: 2025-11-28 16:53:47.721 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:53:47 np0005538960 nova_compute[187252]: 2025-11-28 16:53:47.736 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:53:47 np0005538960 nova_compute[187252]: 2025-11-28 16:53:47.910 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:48 np0005538960 nova_compute[187252]: 2025-11-28 16:53:48.326 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:53:49 np0005538960 podman[233473]: 2025-11-28 16:53:49.131844743 +0000 UTC m=+0.043222134 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:53:49 np0005538960 nova_compute[187252]: 2025-11-28 16:53:49.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:53:50 np0005538960 nova_compute[187252]: 2025-11-28 16:53:50.058 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:52 np0005538960 nova_compute[187252]: 2025-11-28 16:53:52.913 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:55 np0005538960 nova_compute[187252]: 2025-11-28 16:53:55.113 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:53:56 np0005538960 nova_compute[187252]: 2025-11-28 16:53:56.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:53:56 np0005538960 nova_compute[187252]: 2025-11-28 16:53:56.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:53:57 np0005538960 podman[233497]: 2025-11-28 16:53:57.153876886 +0000 UTC m=+0.054380151 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251125, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true)
Nov 28 11:53:57 np0005538960 podman[233498]: 2025-11-28 16:53:57.177763088 +0000 UTC m=+0.075005013 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 11:53:57 np0005538960 podman[233496]: 2025-11-28 16:53:57.178298281 +0000 UTC m=+0.083531834 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251125)
Nov 28 11:53:57 np0005538960 nova_compute[187252]: 2025-11-28 16:53:57.311 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:53:57 np0005538960 nova_compute[187252]: 2025-11-28 16:53:57.915 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:00 np0005538960 nova_compute[187252]: 2025-11-28 16:54:00.115 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:02 np0005538960 nova_compute[187252]: 2025-11-28 16:54:02.916 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:03 np0005538960 podman[233557]: 2025-11-28 16:54:03.141399195 +0000 UTC m=+0.046766091 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Nov 28 11:54:05 np0005538960 nova_compute[187252]: 2025-11-28 16:54:05.117 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:54:06.383 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:54:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:54:06.384 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:54:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:54:06.384 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:54:07 np0005538960 podman[233581]: 2025-11-28 16:54:07.137705136 +0000 UTC m=+0.048394900 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-type=git, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers)
Nov 28 11:54:07 np0005538960 nova_compute[187252]: 2025-11-28 16:54:07.919 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:10 np0005538960 nova_compute[187252]: 2025-11-28 16:54:10.118 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:12 np0005538960 nova_compute[187252]: 2025-11-28 16:54:12.922 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:14 np0005538960 podman[233602]: 2025-11-28 16:54:14.137967161 +0000 UTC m=+0.049739346 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Nov 28 11:54:15 np0005538960 nova_compute[187252]: 2025-11-28 16:54:15.121 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:17 np0005538960 nova_compute[187252]: 2025-11-28 16:54:17.924 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:20 np0005538960 nova_compute[187252]: 2025-11-28 16:54:20.122 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:20 np0005538960 podman[233623]: 2025-11-28 16:54:20.146683406 +0000 UTC m=+0.056851752 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:54:22 np0005538960 nova_compute[187252]: 2025-11-28 16:54:22.927 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:25 np0005538960 nova_compute[187252]: 2025-11-28 16:54:25.124 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:27 np0005538960 nova_compute[187252]: 2025-11-28 16:54:27.929 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:28 np0005538960 podman[233649]: 2025-11-28 16:54:28.177428815 +0000 UTC m=+0.048695350 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Nov 28 11:54:28 np0005538960 podman[233648]: 2025-11-28 16:54:28.177622309 +0000 UTC m=+0.051393647 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, io.buildah.version=1.41.3, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd)
Nov 28 11:54:28 np0005538960 podman[233647]: 2025-11-28 16:54:28.260552817 +0000 UTC m=+0.137788300 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.build-date=20251125, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Nov 28 11:54:30 np0005538960 nova_compute[187252]: 2025-11-28 16:54:30.126 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:32 np0005538960 nova_compute[187252]: 2025-11-28 16:54:32.930 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:34 np0005538960 podman[233707]: 2025-11-28 16:54:34.139861511 +0000 UTC m=+0.047588132 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:54:35 np0005538960 nova_compute[187252]: 2025-11-28 16:54:35.126 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.318 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.319 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:35 np0005538960 ceilometer_agent_compute[198019]: 2025-11-28 16:54:35.320 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Nov 28 11:54:37 np0005538960 nova_compute[187252]: 2025-11-28 16:54:37.933 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:38 np0005538960 podman[233729]: 2025-11-28 16:54:38.204291984 +0000 UTC m=+0.105428807 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Nov 28 11:54:40 np0005538960 nova_compute[187252]: 2025-11-28 16:54:40.127 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:42 np0005538960 nova_compute[187252]: 2025-11-28 16:54:42.935 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.129 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:45 np0005538960 podman[233750]: 2025-11-28 16:54:45.145604274 +0000 UTC m=+0.053149720 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251125, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.338 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.338 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.465 187256 WARNING nova.virt.libvirt.driver [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.466 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5739MB free_disk=73.32962417602539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.466 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.467 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.516 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.516 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.533 187256 DEBUG nova.compute.provider_tree [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed in ProviderTree for provider: 65f0ce30-d9ca-4c16-b536-acd92f5f41ce update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.545 187256 DEBUG nova.scheduler.client.report [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Inventory has not changed for provider 65f0ce30-d9ca-4c16-b536-acd92f5f41ce based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.546 187256 DEBUG nova.compute.resource_tracker [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Nov 28 11:54:45 np0005538960 nova_compute[187252]: 2025-11-28 16:54:45.547 187256 DEBUG oslo_concurrency.lockutils [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.080s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:54:47 np0005538960 nova_compute[187252]: 2025-11-28 16:54:47.547 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:54:47 np0005538960 nova_compute[187252]: 2025-11-28 16:54:47.547 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Nov 28 11:54:47 np0005538960 nova_compute[187252]: 2025-11-28 16:54:47.937 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:49 np0005538960 nova_compute[187252]: 2025-11-28 16:54:49.310 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:54:49 np0005538960 nova_compute[187252]: 2025-11-28 16:54:49.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:54:49 np0005538960 nova_compute[187252]: 2025-11-28 16:54:49.314 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Nov 28 11:54:49 np0005538960 nova_compute[187252]: 2025-11-28 16:54:49.314 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Nov 28 11:54:49 np0005538960 nova_compute[187252]: 2025-11-28 16:54:49.328 187256 DEBUG nova.compute.manager [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Nov 28 11:54:49 np0005538960 nova_compute[187252]: 2025-11-28 16:54:49.329 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:54:50 np0005538960 nova_compute[187252]: 2025-11-28 16:54:50.131 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:51 np0005538960 podman[233770]: 2025-11-28 16:54:51.141078221 +0000 UTC m=+0.048024443 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:54:52 np0005538960 nova_compute[187252]: 2025-11-28 16:54:52.939 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:55 np0005538960 nova_compute[187252]: 2025-11-28 16:54:55.133 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:57 np0005538960 nova_compute[187252]: 2025-11-28 16:54:57.314 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:54:57 np0005538960 nova_compute[187252]: 2025-11-28 16:54:57.941 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:54:58 np0005538960 nova_compute[187252]: 2025-11-28 16:54:58.315 187256 DEBUG oslo_service.periodic_task [None req-d4d90e05-b0b1-4d08-b0b1-09054a2f7b44 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Nov 28 11:54:59 np0005538960 podman[233795]: 2025-11-28 16:54:59.149556416 +0000 UTC m=+0.056920174 container health_status 7e758bdfd87e3e0a49fce34fb6fd232fa7ac9cc2a1e91b3204e1a137bfc7763a (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251125, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Nov 28 11:54:59 np0005538960 podman[233794]: 2025-11-28 16:54:59.161620245 +0000 UTC m=+0.071201218 container health_status 36670a6b5f4328a60762fd6032ce9a3f498f51a5ac658e749d3e85097f285707 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125)
Nov 28 11:54:59 np0005538960 podman[233796]: 2025-11-28 16:54:59.165272926 +0000 UTC m=+0.068868490 container health_status a77bc5792b3b09c1160e25c2079f3dace9ff508ca0ae96be67e20fd8aa2dc6e4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251125, 
tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Nov 28 11:55:00 np0005538960 nova_compute[187252]: 2025-11-28 16:55:00.134 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:55:02 np0005538960 nova_compute[187252]: 2025-11-28 16:55:02.944 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:55:05 np0005538960 podman[233858]: 2025-11-28 16:55:05.028005838 +0000 UTC m=+0.044589167 container health_status 9e7f99ae877d685021b8ae07cc72b124b3b5062da88171f8fa09545ac6980fb8 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Nov 28 11:55:05 np0005538960 nova_compute[187252]: 2025-11-28 16:55:05.136 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:55:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:55:06.384 104369 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Nov 28 11:55:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:55:06.384 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Nov 28 11:55:06 np0005538960 ovn_metadata_agent[104364]: 2025-11-28 16:55:06.384 104369 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Nov 28 11:55:07 np0005538960 nova_compute[187252]: 2025-11-28 16:55:07.946 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:55:09 np0005538960 podman[233882]: 2025-11-28 16:55:09.170800325 +0000 UTC m=+0.077687149 container health_status 1293741c323829ab11b752334ebcfeebe0a3664faf30c1ef8021a3f73ead8786 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package 
manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9)
Nov 28 11:55:10 np0005538960 nova_compute[187252]: 2025-11-28 16:55:10.137 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:55:12 np0005538960 nova_compute[187252]: 2025-11-28 16:55:12.948 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:55:15 np0005538960 nova_compute[187252]: 2025-11-28 16:55:15.138 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:55:16 np0005538960 systemd-logind[788]: New session 43 of user zuul.
Nov 28 11:55:16 np0005538960 systemd[1]: Started Session 43 of User zuul.
Nov 28 11:55:16 np0005538960 podman[233907]: 2025-11-28 16:55:16.078380799 +0000 UTC m=+0.069391964 container health_status 1679becad2de39d12b38addb9a2a9d1cfefc2ba8846f5749b9ce0ff496be7683 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251125, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.vendor=CentOS, tcib_build_tag=1f5c0439f2433cb462b222a5bb23e629)
Nov 28 11:55:17 np0005538960 nova_compute[187252]: 2025-11-28 16:55:17.949 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:55:20 np0005538960 nova_compute[187252]: 2025-11-28 16:55:20.141 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:55:20 np0005538960 ovs-vsctl[234101]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Nov 28 11:55:21 np0005538960 virtqemud[186797]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Nov 28 11:55:21 np0005538960 virtqemud[186797]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Nov 28 11:55:21 np0005538960 virtqemud[186797]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Nov 28 11:55:21 np0005538960 podman[234296]: 2025-11-28 16:55:21.891227273 +0000 UTC m=+0.074743136 container health_status 7d77a20a6b6ab60a961f04051af237c5564d52e2f058fe5d381a9ddb4fc265ee (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Nov 28 11:55:22 np0005538960 nova_compute[187252]: 2025-11-28 16:55:22.956 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Nov 28 11:55:24 np0005538960 systemd[1]: Starting Hostname Service...
Nov 28 11:55:24 np0005538960 systemd[1]: Started Hostname Service.
Nov 28 11:55:25 np0005538960 nova_compute[187252]: 2025-11-28 16:55:25.141 187256 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
